  Gashler, Michael S.'s Publications (detailed list)

THIS PAGE IS NO LONGER MAINTAINED. See our new publications list, which is more up-to-date.


This page contains the titles and abstracts of papers written by Michael S. Gashler, a member of the BYU Neural Networks and Machine Learning (NNML) Research Group. PDF files are available for most papers. A more concise list is also available.



Waffles: A Machine Learning Toolkit

  • Authors: Michael S. Gashler
  • Abstract: We present a breadth-oriented collection of cross-platform command-line tools for researchers in machine learning called Waffles. The Waffles tools are designed to offer a broad spectrum of functionality in a manner that is friendly for scripted automation. All functionality is also available in a C++ class library. Waffles is available under the GNU Lesser General Public License. [An illustrative code sketch follows this entry.]
  • Reference: Journal of Machine Learning Research, volume 12 (MLOSS track), pages 2383–2387, JMLR.org and Microtome Publishing, July 2011.
  • BibTeX
  • Download the file: pdf
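
The abstract above emphasizes that the Waffles tools are meant for scripted automation. The fragment below sketches what driving one of the command-line tools from a Python script could look like; the tool name, subcommand, and argument order are recalled from memory rather than taken from the Waffles documentation, so treat them as assumptions, not documented usage.

    import subprocess

    # Hypothetical invocation of a Waffles command-line tool from a script.
    # The tool name, subcommand, and argument order are assumptions; consult
    # the Waffles documentation for the actual interface.
    result = subprocess.run(
        ["waffles_learn", "crossvalidate", "mydata.arff", "decisiontree"],
        capture_output=True,
        text=True,
    )
    print(result.stdout)  # e.g., scrape the reported accuracy into a larger experiment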

Tangent Space Guided Intelligent Neighbor Finding

  • Authors: Michael S. Gashler and Tony Martinez
  • Abstract: We present an intelligent neighbor-finding algorithm called SAFFRON that chooses neighboring points while avoiding making connections between points on geodesically distant regions of a manifold. SAFFRON identifies the suitability of points to be neighbors by using a relaxation technique that alternately estimates the tangent space at each point, and measures how well the estimated tangent spaces align with each other. This technique enables SAFFRON to form high-quality local neighborhoods, even on manifolds that pass very close to themselves. SAFFRON is even able to find neighborhoods that correctly follow the manifold topology of certain self-intersecting manifolds. [An illustrative code sketch follows this entry.]
  • Reference: In Proceedings of the IEEE International Joint Conference on Neural Networks IJCNN’11, pages 2617–2624, IEEE Press, 2011.
  • BibTeX
  • Download the file: pdf
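
The central quantity in the abstract above is how well the tangent spaces estimated at two points align. The sketch below shows one plausible way to estimate a tangent space by local PCA and to score alignment via principal angles; it illustrates the idea only and is not SAFFRON's relaxation procedure.

    import numpy as np

    def tangent_space(point, neighbors, dim):
        """Estimate a dim-dimensional tangent space at `point` via PCA of
        the displacement vectors to its current neighbors."""
        diffs = neighbors - point                        # shape (k, D)
        _, _, vt = np.linalg.svd(diffs, full_matrices=False)
        return vt[:dim].T                                # (D, dim), orthonormal columns

    def alignment(basis_a, basis_b):
        """Score in [0, 1]: how well two tangent-space bases agree,
        using the product of cosines of the principal angles."""
        sing_vals = np.linalg.svd(basis_a.T @ basis_b, compute_uv=False)
        return float(np.prod(sing_vals))

    # Toy example: two nearby patches of a 1-D manifold embedded in 3-D
    rng = np.random.default_rng(0)
    t = rng.uniform(0, 1, size=8)
    patch = np.c_[t, np.sin(t), np.zeros_like(t)]
    basis1 = tangent_space(patch[0], patch[1:], dim=1)
    basis2 = tangent_space(patch[1], np.vstack([patch[0], patch[2:]]), dim=1)
    print(alignment(basis1, basis2))  # close to 1.0 for well-aligned tangents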

Temporal Nonlinear Dimensionality Reduction

  • Authors: Michael S. Gashler and Tony Martinez
  • Abstract: Existing nonlinear dimensionality reduction (NLDR) algorithms make the assumption that distances between observations are uniformly scaled. Unfortunately, with many interesting systems, this assumption does not hold. We present a new technique called Temporal NLDR (TNLDR), which is specifically designed for analyzing the high-dimensional observations obtained from random-walks with dynamical systems that have external controls. It uses the additional information implicit in ordered sequences of observations to compensate for non-uniform scaling in observation space. We demonstrate that TNLDR computes more accurate estimates of intrinsic state than regular NLDR, and we show that accurate estimates of state can be used to train accurate models of dynamical systems. [An illustrative code sketch follows this entry.]
  • Reference: In Proceedings of the IEEE International Joint Conference on Neural Networks IJCNN’11, pages 1959–1966, IEEE Press, 2011.
  • BibTeX
  • Download the file: pdf
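
To make the abstract's idea concrete: consecutive observations from a controlled random walk carry information about local scale, which can be used to normalize distances before a standard NLDR step. The sketch below illustrates one such normalization under the assumption that consecutive steps cover roughly comparable distances in state space; it is a simple illustration, not the TNLDR formulation itself.

    import numpy as np

    def local_scales(seq):
        """Estimate a local scale at each observation from the distances to its
        temporal neighbors, assuming consecutive steps of the random walk cover
        roughly comparable distances in state space (an illustrative assumption,
        not the TNLDR formulation)."""
        step = np.linalg.norm(np.diff(seq, axis=0), axis=1)      # (T-1,)
        scales = np.empty(len(seq))
        scales[0], scales[-1] = step[0], step[-1]
        scales[1:-1] = 0.5 * (step[:-1] + step[1:])              # average adjacent steps
        return scales

    def rescaled_distances(seq):
        """Pairwise distances normalized by the local scales, so that regions the
        system traverses with large observation-space steps are not treated as
        geodesically large by a subsequent NLDR step."""
        d = np.linalg.norm(seq[:, None, :] - seq[None, :, :], axis=-1)
        s = local_scales(seq)
        return d / np.sqrt(np.outer(s, s))

    # Toy sequence: observations whose scale shrinks over time
    t = np.linspace(0, 1, 50)
    seq = np.c_[np.exp(-3 * t) * np.cos(8 * t), np.exp(-3 * t) * np.sin(8 * t)]
    print(rescaled_distances(seq).shape)  # (50, 50) matrix for an NLDR front end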

Manifold Learning by Graduated Optimization

  • Authors: Michael S. Gashler and Dan Ventura and Tony Martinez
  • Abstract: We present an algorithm for manifold learning called Manifold Sculpting, which utilizes graduated optimization to seek an accurate manifold embedding. Empirical analysis across a wide range of manifold problems indicates that Manifold Sculpting yields more accurate results than a number of existing algorithms, including Isomap, LLE, HLLE, and L-MVU, and is significantly more efficient than HLLE and L-MVU. Manifold Sculpting also has the ability to benefit from prior knowledge about expected results. [An illustrative code sketch follows this entry.]
  • Reference: IEEE Transactions on Systems, Man, and Cybernetics, Part B, volume PP (99), pages 1–13, 2011.
  • BibTeX
  • Download the file: pdf
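
For readers unfamiliar with the term, graduated optimization solves a sequence of progressively less-smoothed versions of a difficult objective, seeding each stage with the previous solution. The toy sketch below shows that general strategy on a one-dimensional multi-modal function; Manifold Sculpting's actual objective (preserving local distances and angles while scaling away discarded dimensions) is not reproduced here.

    import numpy as np
    from scipy.optimize import minimize

    def objective(x, smoothing):
        # A wiggly 1-D cost whose wiggles are damped while `smoothing` is large
        return x**2 + (1.0 / (1.0 + smoothing)) * np.sin(20.0 * x)

    x = np.array([2.0])                        # poor initial guess
    for smoothing in [100.0, 10.0, 1.0, 0.0]:  # gradually reveal the true objective
        x = minimize(lambda v: objective(v[0], smoothing), x).x
    # Ends close to the global minimum near x = -0.08; a direct minimization of the
    # unsmoothed objective from x = 2.0 stalls in a local minimum near the start.
    print(x)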

Decision Tree Ensemble: Small Heterogeneous Is Better Than Large Homogeneous

  • Authors: Michael S. Gashler and Christophe Giraud-Carrier and Tony Martinez
  • Abstract: Using decision trees that split on randomly selected attributes is one way to increase the diversity within an ensemble of decision trees. Another approach increases diversity by combining multiple tree algorithms. The random forest approach has become popular because it is simple and yields good results with common datasets. We present a technique that combines heterogeneous tree algorithms and contrast it with homogeneous forest algorithms. Our results indicate that random forests do poorly when faced with irrelevant attributes, while our heterogeneous technique handles them robustly. Further, we show that large ensembles of random trees are more susceptible to diminishing returns than our technique. We are able to obtain better results across a large number of common datasets with a significantly smaller ensemble. [An illustrative code sketch follows this entry.]
  • Reference: In Proceedings of the Seventh International Conference on Machine Learning and Applications (ICMLA '08), pages 900–905, December 2008.
  • BibTeX
  • Download the file: pdf
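
The contrast drawn in the abstract, a small heterogeneous ensemble versus a large homogeneous one, can be reproduced in miniature with off-the-shelf components. The sketch below mixes bagged trees built with different split strategies and compares them against a larger random forest on data with many irrelevant attributes; the particular tree variants are illustrative stand-ins, not the specific algorithms combined in the paper.

    from sklearn.datasets import make_classification
    from sklearn.ensemble import BaggingClassifier, RandomForestClassifier, VotingClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    # 20 attributes, only 5 informative, so 15 are irrelevant
    X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                               n_redundant=0, random_state=0)

    # Small heterogeneous ensemble: bagged trees with differing split strategies
    heterogeneous = VotingClassifier([
        ("entropy", BaggingClassifier(DecisionTreeClassifier(criterion="entropy"),
                                      n_estimators=5, random_state=0)),
        ("gini", BaggingClassifier(DecisionTreeClassifier(criterion="gini"),
                                   n_estimators=5, random_state=0)),
        ("random_split", BaggingClassifier(DecisionTreeClassifier(splitter="random"),
                                           n_estimators=5, random_state=0)),
    ], voting="hard")

    # Large homogeneous ensemble for comparison
    homogeneous = RandomForestClassifier(n_estimators=100, random_state=0)

    for name, model in [("heterogeneous (15 trees)", heterogeneous),
                        ("homogeneous (100 trees)", homogeneous)]:
        print(name, cross_val_score(model, X, y, cv=5).mean())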

Iterative Non-linear Dimensionality Reduction with Manifold Sculpting

  • Authors: Michael S. Gashler and Dan Ventura and Tony Martinez
  • Abstract: Many algorithms have been recently developed for reducing dimensionality by projecting data onto an intrinsic non-linear manifold. Unfortunately, existing algorithms often lose significant precision in this transformation. Manifold Sculpting is a new algorithm that iteratively reduces dimensionality by simulating surface tension in local neighborhoods. We present several experiments that show Manifold Sculpting yields more accurate results than existing algorithms with both generated and natural data-sets. Manifold Sculpting is also able to benefit from prior dimensionality reduction efforts. [An illustrative code sketch follows this entry.]
  • Reference: In J.C. Platt, D. Koller, Y. Singer, and S. Roweis, editors, Advances in Neural Information Processing Systems 20, pages 513–520, MIT Press, Cambridge, MA, 2008.
  • BibTeX
  • Download the file: pdf
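
The last sentence of the abstract notes that the iterative approach can be seeded with the output of a prior dimensionality-reduction step. The sketch below illustrates that general pattern, assuming a PCA seed followed by a plain stress-style refinement of local distances; it is a generic stand-in, not Manifold Sculpting's distance-and-angle adjustment.

    import numpy as np

    def pca_seed(X, d):
        """Project centered data onto its top-d principal components."""
        Xc = X - X.mean(axis=0)
        _, _, vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ vt[:d].T

    def refine(X, Y, k=8, steps=200, lr=0.01):
        """Nudge the low-dimensional points Y so that distances to each point's
        k nearest high-dimensional neighbors are better preserved."""
        n = len(X)
        dist_hi = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
        nbrs = np.argsort(dist_hi, axis=1)[:, 1:k + 1]      # skip self at column 0
        for _ in range(steps):
            for i in range(n):
                diff = Y[i] - Y[nbrs[i]]                    # (k, d)
                dist_lo = np.linalg.norm(diff, axis=1) + 1e-12
                err = (dist_lo - dist_hi[i, nbrs[i]]) / dist_lo
                Y[i] -= lr * (err[:, None] * diff).sum(axis=0)
        return Y

    # Swiss-roll-like data: seed with PCA, then refine the 2-D embedding
    rng = np.random.default_rng(0)
    t = rng.uniform(0, 3 * np.pi, 200)
    swiss = np.c_[t * np.cos(t), rng.uniform(0, 5, 200), t * np.sin(t)]
    embedding = refine(swiss, pca_seed(swiss, 2))
    print(embedding.shape)  # (200, 2)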
