Drake, Adam's Publications (detailed list)

THIS PAGE IS NO LONGER MAINTAINED. A newer, more up-to-date publications list is available.


This page contains the titles and abstracts of papers written by Adam Drake, a member of the BYU Neural Networks and Machine Learning (NNML) Research Group. PDF files are available for most papers. A more concise list is also available.



Search Techniques for Fourier-Based Learning

  • Authors: Adam Drake and Dan Ventura
  • Abstract: Fourier-based learning algorithms rely on being able to efficiently find the large coefficients of a function’s spectral representation. In this paper we introduce and analyze techniques for finding large coefficients. We show how a previously introduced search technique can be generalized from the Boolean case to the real-valued case, and we apply it in branch-and-bound and beam search algorithms that have significant advantages over the best-first algorithm in which the technique was originally introduced.
  • Reference: In Proceedings of the International Joint Conference on Artificial Intelligence, pages 1040–1045, July 2009. (First appeared in Proceedings of the AAAI Workshop on Search in Artificial Intelligence and Robotics, 2008).
  • BibTeX:
    @inproceedings{drake2009a,
      author    = {Drake, Adam and Ventura, Dan},
      title     = {Search Techniques for {F}ourier-Based Learning},
      booktitle = {Proceedings of the International Joint Conference on Artificial Intelligence},
      pages     = {1040--1045},
      month     = {July},
      year      = {2009},
      note      = {(First appeared in Proceedings of the {AAAI} Workshop on Search in Artificial Intelligence and Robotics, 2008)},
    }
  • Download the file: pdf
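
The spectral representation the abstract refers to can be illustrated with a brute-force computation. This is only a sketch of what the coefficients are, not the paper's method; the paper's contribution is precisely finding the large coefficients without this exhaustive enumeration:

```python
from itertools import product

def fourier_coefficients(f, n):
    """All Fourier (Walsh) coefficients of f: {0,1}^n -> {-1,+1}.
    For a subset S (given as a bitmask), f_hat(S) is the average of
    f(x) * chi_S(x), where chi_S(x) = (-1)^(parity of the bits of x
    selected by S)."""
    inputs = list(product([0, 1], repeat=n))
    coeffs = {}
    for S in range(2 ** n):
        total = 0
        for x in inputs:
            parity = sum(x[i] for i in range(n) if S >> i & 1) % 2
            total += f(x) * (-1) ** parity
        coeffs[S] = total / len(inputs)
    return coeffs

# Example: 3-bit majority.  Its only nonzero coefficients sit on the
# three singleton sets and the full set, each with magnitude 1/2.
maj = lambda x: 1 if sum(x) >= 2 else -1
coeffs = fourier_coefficients(maj, 3)
large = [S for S in coeffs if abs(coeffs[S]) > 0.25]
```

Since this table has 2^n entries, a Fourier-based learner cannot build it for realistic n, which is why search techniques like the branch-and-bound and beam variants above are needed.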

Sentiment Regression: Using Real-Valued Scores to Summarize Overall Document Sentiment

  • Authors: Adam Drake and Eric Ringger and Dan Ventura
  • Abstract: In this paper, we consider a sentiment regression problem: summarizing the overall sentiment of a review with a real-valued score. Empirical results on a set of labeled reviews show that real-valued sentiment modeling is feasible, as several algorithms improve upon baseline performance. We also analyze performance as the granularity of the classification problem moves from two-class (positive vs. negative) towards infinite-class (real-valued).
  • Reference: In Proceedings of the IEEE International Conference on Semantic Computing, pages 152–157, August 2008.
  • BibTeX:
    @inproceedings{drv.icsc2008,
      author    = {Drake, Adam and Ringger, Eric and Ventura, Dan},
      title     = {Sentiment Regression: Using Real-Valued Scores to Summarize Overall Document Sentiment},
      booktitle = {Proceedings of the {IEEE} International Conference on Semantic Computing},
      pages     = {152--157},
      month     = {August},
      year      = {2008},
    }
  • Download the file: pdf
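
The real-valued formulation and the two-class-to-fine-grained spectrum can be illustrated with a toy scorer. The lexicon and weights below are invented for the example; the paper learns its models from labeled reviews rather than using a fixed word list:

```python
# Invented mini-lexicon for illustration only.
WEIGHTS = {"great": 1.0, "good": 0.5, "okay": 0.0, "bad": -0.5, "awful": -1.0}

def sentiment_score(review):
    """Average the weights of known words; returns a score in [-1, 1]."""
    hits = [WEIGHTS[w] for w in review.lower().split() if w in WEIGHTS]
    return sum(hits) / len(hits) if hits else 0.0

def quantize(score, k):
    """Map a score in [-1, 1] onto one of k evenly spaced class labels.
    k = 2 gives positive vs. negative; growing k approaches the
    real-valued (infinite-class) setting the abstract describes."""
    return min(int((score + 1) / 2 * k), k - 1)

s = sentiment_score("good plot but awful acting")  # (0.5 - 1.0) / 2 = -0.25
two_class = quantize(s, 2)    # 0, i.e. the negative half
ten_class = quantize(s, 10)   # finer-grained bin
```

Quantizing the same real-valued score at different k is one simple way to study how performance changes as the granularity of the classification problem increases.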

Comparing High-Order Boolean Features

  • Authors: Adam Drake and Dan Ventura
  • Abstract: Many learning algorithms attempt, either explicitly or implicitly, to discover useful high-order features. When considering all possible functions that could be encountered, no particular type of high-order feature should be more useful than any other. However, this paper presents arguments and empirical results that suggest that for the learning problems typically encountered in practice, some high-order features may be more useful than others.
  • Reference: In Proceedings of the Joint Conference on Information Sciences, pages 428–431, July 2005.
  • BibTeX:
    @inproceedings{drake.ventura.jcis2005,
      author    = {Drake, Adam and Ventura, Dan},
      title     = {Comparing High-Order Boolean Features},
      booktitle = {Proceedings of the Joint Conference on Information Sciences},
      pages     = {428--431},
      month     = {July},
      year      = {2005},
    }
  • Download the file: pdf
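
One simple way to compare the usefulness of different high-order feature types, sketched here as an illustration rather than as the paper's experimental setup, is to build every second-order feature of each type and score it by its correlation with the labels:

```python
from itertools import combinations

def correlation(feat_vals, labels):
    """Empirical correlation of a {0,1} feature with {0,1} labels:
    the agreement frequency rescaled to [-1, 1]."""
    agree = sum(f == y for f, y in zip(feat_vals, labels))
    return 2 * agree / len(labels) - 1

FEATURE_TYPES = {
    "AND": lambda a, b: a & b,
    "OR":  lambda a, b: a | b,
    "XOR": lambda a, b: a ^ b,
}

def score_pairwise_features(X, y):
    """Score every second-order feature (one type, one variable pair)
    by the magnitude of its correlation with the labels."""
    n = len(X[0])
    scores = []
    for name, op in FEATURE_TYPES.items():
        for i, j in combinations(range(n), 2):
            vals = [op(row[i], row[j]) for row in X]
            scores.append((name, i, j, correlation(vals, y)))
    return sorted(scores, key=lambda t: -abs(t[3]))

# Labels generated by x0 AND x1: only the AND feature on that pair
# correlates perfectly with y on this sample.
X = [(0, 0, 0), (0, 1, 0), (1, 0, 1), (1, 1, 0),
     (1, 1, 1), (0, 0, 1), (1, 0, 0), (0, 1, 1)]
y = [row[0] & row[1] for row in X]
best = score_pairwise_features(X, y)[0]
```

Running such comparisons over many practical datasets, rather than over all possible functions, is what lets some feature types emerge as more useful than others.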

A Practical Generalization of Fourier-Based Learning

  • Authors: Adam Drake and Dan Ventura
  • Abstract: This paper presents a search algorithm for finding functions that are highly correlated with an arbitrary set of data. The functions found by the search can be used to approximate the unknown function that generated the data. A special case of this approach is a method for learning Fourier representations. Empirical results demonstrate that on typical real-world problems the most highly correlated functions can be found very quickly, while combinations of these functions provide good approximations of the unknown function.
  • Reference: In ICML ’05: Proceedings of the 22nd International Conference on Machine Learning, pages 185–192, New York, NY, USA, 2005. ACM Press.
  • BibTeX:
    @inproceedings{drake.ventura.icml2005,
      author    = {Drake, Adam and Ventura, Dan},
      title     = {A Practical Generalization of Fourier-Based Learning},
      booktitle = {ICML '05: Proceedings of the 22nd International Conference on Machine Learning},
      pages     = {185--192},
      publisher = {ACM Press},
      address   = {New York, NY, USA},
      year      = {2005},
    }
  • Download the file: pdf
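
The "combinations of these functions" step can be sketched with the parity (Fourier) basis, the special case the abstract mentions. This is a toy reconstruction over a fully enumerated input space, not the paper's search algorithm, which finds the highly correlated functions without such enumeration:

```python
from itertools import product

def parity(S, x):
    """chi_S(x) = (-1)^(parity of the bits of x selected by mask S)."""
    return (-1) ** (sum(x[i] for i in range(len(x)) if S >> i & 1) % 2)

def fit_weights(target, basis, inputs):
    """Weight each basis function by its average correlation with the target."""
    return {S: sum(target(x) * parity(S, x) for x in inputs) / len(inputs)
            for S in basis}

def predict(weights, x):
    """Approximate the target by the sign of the weighted combination."""
    total = sum(w * parity(S, x) for S, w in weights.items())
    return 1 if total >= 0 else -1

# 5-bit majority: the five single-variable parities are its most highly
# correlated basis functions, and (majority being a linear threshold
# function) the sign of their weighted sum recovers it exactly.
maj = lambda x: 1 if sum(x) >= 3 else -1
inputs = list(product([0, 1], repeat=5))
weights = fit_weights(maj, [1 << i for i in range(5)], inputs)
accuracy = sum(predict(weights, x) == maj(x) for x in inputs) / len(inputs)
```

On majority the degree-1 combination is exact; for typical real-world targets the combination of the most correlated functions yields a good, but not perfect, approximation.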
