Ventura, Dan's Publications (detailed list)

THIS PAGE IS NO LONGER MAINTAINED. Click here for our new publications list, which is more up-to-date.


This page contains the titles and abstracts of papers written by Dan Ventura, a member of the BYU Neural Networks and Machine Learning (NNML) Research Group. Postscript files are available for most papers. A more concise list is available.



A Sub-symbolic Model of the Cognitive Processes of Re-representation and Insight

  • Authors: Dan Ventura
  • Abstract: We present a sub-symbolic computational model for effecting knowledge re-representation and insight. Given a set of data, manifold learning is used to automatically organize the data into one or more representational transformations, which are then learned with a set of neural networks. The result is a set of neural filters that can be applied to new data as re-representation operators.
  • Reference: In Proceedings of ACM Creativity and Cognition, to appear, October 2009.
  • BibTeX:
    @inproceedings{ventura.cc09,
    author = {Ventura, Dan},
    title = {A Sub-symbolic Model of the Cognitive Processes of Re-representation and Insight},
    booktitle = {Proceedings of ACM Creativity and Cognition},
    pages = {to appear},
    month = {October},
    year = {2009},
    }
  • Download the file: pdf

A Reductio Ad Absurdum Experiment in Sufficiency for Evaluating (Computational) Creative Systems

  • Authors: Dan Ventura
  • Abstract: We consider a combination of two recent proposals for characterizing computational creativity and explore the sufficiency of the resultant framework. We do this in the form of a gedanken experiment designed to expose the nature of the framework, what it has to say about computational creativity, how it might be improved and what questions this raises.
  • Reference: In Proceedings of the International Joint Workshop on Computational Creativity, pages 11–19, September 2008.
  • BibTeX:
    @inproceedings{ventura.ijwcc08,
    author = {Ventura, Dan},
    title = {A Reductio Ad Absurdum Experiment in Sufficiency for Evaluating (Computational) Creative Systems},
    booktitle = {Proceedings of the International Joint Workshop on Computational Creativity},
    pages = {11--19},
    month = {September},
    year = {2008},
    }
  • Download the file: pdf

Data-Driven Programming and Behavior for Autonomous Virtual Characters

  • Authors: Jonathan Dinerstein and Dan Ventura and Michael Goodrich and Parris Egbert
  • Abstract: We present a high-level overview of a system for programming autonomous virtual characters by demonstration. The result is a deliberative model of agent behavior that is stylized and effective, as demonstrated in five different case studies.
  • Reference: In Proceedings of the Association for the Advancement of Artificial Intelligence, pages 1450–1451, July 2008.
  • BibTeX:
    @inproceedings{dinerstein.aaai08,
    author = {Dinerstein, Jonathan and Ventura, Dan and Goodrich, Michael and Egbert, Parris},
    title = {Data-Driven Programming and Behavior for Autonomous Virtual Characters},
    booktitle = {Proceedings of the Association for the Advancement of Artificial Intelligence},
    pages = {1450--1451},
    month = {July},
    year = {2008},
    }
  • Download the file: pdf

Sub-symbolic Re-representation to Facilitate Learning Transfer

  • Authors: Dan Ventura
  • Abstract: We consider the issue of knowledge (re-)representation in the context of learning transfer and present a sub-symbolic approach for effecting such transfer. Given a set of data, manifold learning is used to automatically organize the data into one or more representational transformations, which are then learned with a set of neural networks. The result is a set of neural filters that can be applied to new data as re-representation operators. Encouraging preliminary empirical results elucidate the approach and demonstrate its feasibility, suggesting possible implications for the broader field of creativity.
  • Reference: In Creative Intelligent Systems, AAAI 2008 Spring Symposium Technical Report SS-08-03, pages 128–134, March 2008.
  • BibTeX:
    @inproceedings{ventura.aaaiss08,
    author = {Ventura, Dan},
    title = {Sub-symbolic Re-representation to Facilitate Learning Transfer},
    booktitle = {Creative Intelligent Systems, {AAAI} 2008 Spring Symposium Technical Report {SS}-08-03},
    pages = {128--134},
    month = {March},
    year = {2008},
    }

Robust Multi-Modal Biometric Fusion via SVM Ensemble

  • Authors: Sabra Dinerstein and Jon Dinerstein and Dan Ventura
  • Abstract: Existing learning-based multi-modal biometric fusion techniques typically employ a single static Support Vector Machine (SVM). This type of fusion improves the accuracy of biometric classification, but it also has serious limitations because it is based on the assumptions that the set of biometric classifiers to be fused is local, static, and complete. We present a novel multi-SVM approach to multi-modal biometric fusion that addresses the limitations of existing fusion techniques and show empirically that our approach retains good classification accuracy even when some of the biometric modalities are unavailable.
  • Reference: In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, pages 1530–1535, October 2007.
  • BibTeX:
    @inproceedings{dinerstein.smc07,
    author = {Dinerstein, Sabra and Dinerstein, Jon and Ventura, Dan},
    title = {Robust Multi-Modal Biometric Fusion via {SVM} Ensemble},
    booktitle = {Proceedings of the {IEEE} International Conference on Systems, Man and Cybernetics},
    pages = {1530--1535},
    month = {October},
    year = {2007},
    }
  • Download the file: pdf

Learning Policies for Embodied Virtual Agents Through Demonstration

  • Authors: Jonathan Dinerstein and Parris Egbert and Dan Ventura
  • Abstract: Although many powerful AI and machine learning techniques exist, it remains difficult to quickly create AI for embodied virtual agents that produces visually lifelike behavior. This is important for applications (e.g., games, simulators, interactive displays) where an agent must behave in a manner that appears human-like. We present a novel technique for learning reactive policies that mimic demonstrated human behavior. The user demonstrates the desired behavior by dictating the agent’s actions during an interactive animation. Later, when the agent is to behave autonomously, the recorded data is generalized to form a continuous state-to-action mapping. Combined with an appropriate animation algorithm (e.g., motion capture), the learned policies realize stylized and natural-looking agent behavior. We empirically demonstrate the efficacy of our technique for quickly producing policies which result in lifelike virtual agent behavior.
  • Reference: In Proceedings of the International Joint Conference on Artificial Intelligence, pages 1257–1262, Hyderabad, India, January 2007.
  • BibTeX:
    @inproceedings{dinerstein.ijcai07,
    author = {Dinerstein, Jonathan and Egbert, Parris and Ventura, Dan},
    title = {Learning Policies for Embodied Virtual Agents Through Demonstration},
    booktitle = {Proceedings of the International Joint Conference on Artificial Intelligence},
    pages = {1257--1262},
    address = {Hyderabad, India},
    month = {January},
    year = {2007},
    }
  • Download the file: pdf

Clustering Music via the Temporal Similarity of Timbre

  • Authors: Jake Merrell and Dan Ventura and Bryan Morse
  • Abstract: We consider the problem of measuring the similarity of streaming music content and present a method for modeling, on the fly, the temporal progression of a song’s timbre. Using a minimum distance classification scheme, we give an approach to classifying streaming music sources and present performance results for auto-associative song identification and for content-based clustering of streaming music. We discuss possible extensions to the approach and possible uses for such a system.
  • Reference: In IJCAI Workshop on Artificial Intelligence and Music, pages 153–164, January 2007.
  • BibTeX:
    @inproceedings{merrell.ijcai07,
    author = {Merrell, Jake and Ventura, Dan and Morse, Bryan},
    title = {Clustering Music via the Temporal Similarity of Timbre},
    booktitle = {{IJCAI} Workshop on Artificial Intelligence and Music},
    pages = {153--164},
    month = {January},
    year = {2007},
    }
  • Download the file: pdf

Geometric Task Decomposition in a Multi-agent Environment

  • Authors: Kaivan Kamali and Dan Ventura and Amulya Garga and Soundar Kumara
  • Abstract: Task decomposition in a multi-agent environment is often performed online. This paper proposes a method for sub-task allocation that can be performed before the agents are deployed, reducing the need for communication among agents during their mission. The proposed method uses a Voronoi diagram to partition the task-space among team members and includes two phases: static and dynamic. Static decomposition (performed in simulation before the start of the mission) repeatedly partitions the task-space by generating random diagrams and measuring the efficacy of the corresponding sub-task allocation. If necessary, dynamic decomposition (performed in simulation after the start of a mission) modifies the result of a static decomposition (i.e. in case of resource limitations for some agents). Empirical results are reported for the problem of surveillance of an arbitrary region by a team of agents.
  • Reference: Applied Artificial Intelligence, volume 20 (5), pages 437–456, 2006.
  • BibTeX:
    @article{Kamali.aai06,
    author = {Kamali, Kaivan and Ventura, Dan and Garga, Amulya and Kumara, Soundar},
    title = {Geometric Task Decomposition in a Multi-agent Environment},
    journal = {Applied Artificial Intelligence},
    volume = {20},
    number = {5},
    pages = {437--456},
    year = {2006},
    }
  • Download the file: pdf
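The static phase described in the abstract above can be sketched in a few lines: sample random site sets, score the Voronoi partition each induces, and keep the best. This is an illustrative sketch only, assuming a uniform grid task-space and a simple area-balance efficacy measure; the names `voronoi_partition` and `allocation_score` are hypothetical, not from the paper.

```python
import random

def voronoi_partition(sites, width, height):
    """Assign each cell of a width x height grid to its nearest site
    (squared Euclidean distance), giving a discrete Voronoi partition."""
    regions = {i: [] for i in range(len(sites))}
    for x in range(width):
        for y in range(height):
            nearest = min(range(len(sites)),
                          key=lambda i: (x - sites[i][0]) ** 2 + (y - sites[i][1]) ** 2)
            regions[nearest].append((x, y))
    return regions

def allocation_score(regions, width, height):
    """One possible efficacy measure: how evenly the task-space is split
    among agents (1.0 = perfectly balanced sub-task areas)."""
    ideal = width * height / len(regions)
    worst = max(abs(len(cells) - ideal) for cells in regions.values())
    return 1.0 - worst / (width * height)

# Static decomposition: repeatedly generate random site sets (one site per
# agent) and keep the set whose partition scores best.
random.seed(0)
best_sites = max(
    ([(random.randrange(20), random.randrange(20)) for _ in range(4)]
     for _ in range(50)),
    key=lambda s: allocation_score(voronoi_partition(s, 20, 20), 20, 20),
)
regions = voronoi_partition(best_sites, 20, 20)
```

The dynamic phase would then re-run this search from the current allocation when an agent's resources change; that part is omitted here.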

Fast and Robust Incremental Action Prediction for Interactive Agents

  • Authors: John Dinerstein and Dan Ventura and Parris Egbert
  • Abstract: The ability for a given agent to adapt on-line to better interact with another agent is a difficult and important problem. This problem becomes even more difficult when the agent to interact with is a human, since humans learn quickly and behave nondeterministically. In this paper we present a novel method whereby an agent can incrementally learn to predict the actions of another agent (even a human), and thereby can learn to better interact with that agent. We take a case-based approach, where the behavior of the other agent is learned in the form of state-action pairs. We generalize these cases either through continuous k-nearest neighbor, or a modified bounded minimax search. Through our case studies, our technique is empirically shown to require little storage, learn very quickly, and be fast and robust in practice. It can accurately predict actions several steps into the future. Our case studies include interactive virtual environments involving mixtures of synthetic agents and humans, with cooperative and/or competitive relationships.
  • Reference: Computational Intelligence, volume 21 (1), pages 90–110, 2005.
  • BibTeX:
    @article{dinerstein.ci05,
    author = {Dinerstein, John and Ventura, Dan and Egbert, Parris},
    title = {Fast and Robust Incremental Action Prediction for Interactive Agents},
    journal = {Computational Intelligence},
    volume = {21},
    number = {1},
    pages = {90--110},
    year = {2005},
    }
  • Download the file: pdf
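The case-based idea in the abstract above, storing state-action pairs and generalizing with continuous k-nearest neighbor, can be sketched as follows. This is a minimal illustration, not the paper's implementation; the class name `ActionPredictor` and the inverse-distance weighting scheme are assumptions for the sketch.

```python
import math

class ActionPredictor:
    """Case-based action prediction: store observed (state, action) pairs
    and generalize with distance-weighted continuous k-nearest neighbor."""
    def __init__(self, k=3):
        self.k = k
        self.cases = []  # list of (state, action) tuples

    def observe(self, state, action):
        self.cases.append((state, action))

    def predict(self, state):
        # Find the k stored cases whose states are nearest the query state.
        neighbors = sorted(self.cases,
                           key=lambda c: self._dist(c[0], state))[:self.k]
        # Weight each neighbor's action inversely by its distance.
        weights = [1.0 / (self._dist(s, state) + 1e-9) for s, _ in neighbors]
        total = sum(weights)
        dim = len(neighbors[0][1])
        return tuple(sum(w * a[i] for w, (_, a) in zip(weights, neighbors)) / total
                     for i in range(dim))

    @staticmethod
    def _dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Incrementally learn a toy policy (action mirrors the state) from
# demonstrated state-action pairs, then predict for an unseen state.
p = ActionPredictor(k=2)
for s, a in [((0.0, 0.0), (0.0, 0.0)), ((1.0, 0.0), (1.0, 0.0)),
             ((0.0, 1.0), (0.0, 1.0)), ((1.0, 1.0), (1.0, 1.0))]:
    p.observe(s, a)
pred = p.predict((0.9, 0.1))
```

Because learning is just appending a case, adaptation is incremental and cheap, which is the property the paper emphasizes for interacting with fast-adapting humans.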

Training a Quantum Neural Network

  • Authors: Bob Ricks and Dan Ventura
  • Abstract: Quantum learning holds great promise for the field of machine intelligence. The most studied quantum learning algorithm is the quantum neural network. Many such models have been proposed, yet none has become a standard. In addition, these models usually leave out many details, often excluding how they intend to train their networks. This paper discusses one approach to the problem and what advantages it would have over classical networks.
  • Reference: In Neural Information Processing Systems, pages 1019–1026, December 2003.
  • BibTeX:
    @inproceedings{ricks.nips03,
    author = {Ricks, Bob and Ventura, Dan},
    title = {Training a Quantum Neural Network},
    booktitle = {Neural Information Processing Systems},
    pages = {1019--1026},
    month = {December},
    year = {2003},
    }
  • Download the file: pdf

Probabilistic Connections in Relaxation Networks

  • Authors: Dan Ventura
  • Abstract: This paper reports results from studying the behavior of Hopfield-type networks with probabilistic connections. As the probabilities decrease, network performance degrades. In order to compensate, two network modifications — input persistence and a new activation function — are suggested, and empirical results indicate that the modifications significantly improve network performance.
  • Reference: In Proceedings of the International Joint Conference on Neural Networks, pages 934–938, May 2002.
  • BibTeX:
    @inproceedings{ventura.ijcnn02,
    author = {Ventura, Dan},
    title = {Probabilistic Connections in Relaxation Networks},
    booktitle = {Proceedings of the International Joint Conference on Neural Networks},
    pages = {934--938},
    month = {May},
    year = {2002},
    }
  • Download the file: pdf
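A Hopfield-type network with probabilistic connections, as studied in the paper above, can be simulated in a few lines: train Hebbian weights, then relax asynchronously while letting each connection participate in an update only with some probability. The functions below are an illustrative sketch (the paper's two compensating modifications, input persistence and the new activation function, are not reproduced here).

```python
import random

def train(patterns):
    """Hebbian weights for a Hopfield-type network (zero diagonal)."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, state, prob, steps=200, rng=None):
    """Asynchronous relaxation in which each connection contributes to an
    update only with probability `prob` (the probabilistic-connection model).
    With prob=1.0 this is ordinary Hopfield relaxation."""
    rng = rng or random.Random(0)
    s = list(state)
    n = len(s)
    for _ in range(steps):
        i = rng.randrange(n)
        net = sum(w[i][j] * s[j] for j in range(n) if rng.random() < prob)
        if net != 0:
            s[i] = 1 if net > 0 else -1
    return s

pattern = [1, -1, 1, -1, 1, -1, 1, -1]
w = train([pattern])
noisy = list(pattern)
noisy[0] = -noisy[0]  # corrupt one bit
restored = recall(w, noisy, prob=1.0)
```

Lowering `prob` below 1.0 makes each neuron see only a random subset of its inputs per update, which is the degradation regime the paper's modifications are designed to compensate for.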

Pattern Classification Using a Quantum System

  • Authors: Dan Ventura
  • Abstract: We consider and compare three approaches to quantum pattern classification, presenting empirical results from simulations.
  • Reference: In Proceedings of the Joint Conference on Information Sciences, pages 537–640, March 2002.
  • BibTeX:
    @inproceedings{ventura.jcis02,
    author = {Ventura, Dan},
    title = {Pattern Classification Using a Quantum System},
    booktitle = {Proceedings of the Joint Conference on Information Sciences},
    pages = {537--640},
    month = {March},
    year = {2002},
    }
  • Download the file: pdf

A Quantum Analog to Basis Function Networks

  • Authors: Dan Ventura
  • Abstract: A Fourier-based quantum computational learning algorithm with similarities to classical basis function networks is developed. Instead of a Gaussian basis, the quantum algorithm uses a discrete Fourier basis with the output being a linear combination of the basis. A set of examples is considered as a quantum system that undergoes unitary transformations to produce learning. The main result of the work is a quantum computational learning algorithm that is unique among quantum algorithms as it does not assume a priori knowledge of a function f.
  • Reference: In Proceedings of the International Conference on Computing Anticipatory Systems, pages 286–295, August 2001.
  • BibTeX:
    @inproceedings{ventura.casys01,
    author = {Ventura, Dan},
    title = {A Quantum Analog to Basis Function Networks},
    booktitle = {Proceedings of the International Conference on Computing Anticipatory Systems},
    pages = {286--295},
    month = {August},
    year = {2001},
    }
  • Download the file: pdf

On the Utility of Entanglement in Quantum Neural Computing

  • Authors: Dan Ventura
  • Abstract: Efforts in combining quantum and neural computation are briefly discussed and the concept of entanglement as it applies to this subject is addressed. Entanglement is perhaps the least understood aspect of quantum systems used for computation, yet it is apparently most responsible for their computational power. This paper argues for the importance of understanding and utilizing entanglement in quantum neural computation.
  • Reference: In Proceedings of the International Joint Conference on Neural Networks, pages 1565–1570, July 2001.
  • BibTeX:
    @inproceedings{ventura.ijcnn01,
    author = {Ventura, Dan},
    title = {On the Utility of Entanglement in Quantum Neural Computing},
    booktitle = {Proceedings of the International Joint Conference on Neural Networks},
    pages = {1565--1570},
    month = {July},
    year = {2001},
    }
  • Download the file: pdf

Learning Quantum Operators

  • Authors: Dan Ventura
  • Abstract: Consider the system F|x⟩ = |y⟩, where the operator F is unknown. We examine the possibility of learning the operator F inductively, drawing analogies with ideas from classical computational learning.
  • Reference: In Proceedings of the Joint Conference on Information Sciences, pages 750–752, March 2000.
  • BibTeX:
    @inproceedings{ventura.jcis00,
    author = {Ventura, Dan},
    title = {Learning Quantum Operators},
    booktitle = {Proceedings of the Joint Conference on Information Sciences},
    pages = {750--752},
    month = {March},
    year = {2000},
    }
  • Download the file: pdf

Quantum Neural Networks

  • Authors: Alexandr Ezhov and Dan Ventura
  • Abstract: This chapter outlines the research, development and perspectives of quantum neural networks – a burgeoning new field which integrates classical neurocomputing with quantum computation. It is argued that the study of quantum neural networks may give us both new understanding of brain function and unprecedented possibilities in creating new systems for information processing, including solving classically intractable problems, associative memory with exponential capacity and possibly overcoming the limitations posed by the Church-Turing thesis.
  • Reference: Kasabov, N., editor, Future Directions for Intelligent Systems and Information Science 2000, Physica-Verlag, 2000.
  • BibTeX:
    @article{ezhov.fdisis00,
    author = {Ezhov, Alexandr and Ventura, Dan},
    title = {Quantum Neural Networks},
    editor = {Kasabov, N.},
    journal = {Future Directions for Intelligent Systems and Information Science 2000},
    address = {Physica-Verlag},
    year = {2000},
    }
  • Download the file: pdf

Distributed Queries for Quantum Associative Memory

  • Authors: Alexandr Ezhov and A. Nifanova and Dan Ventura
  • Abstract: This paper discusses a model of quantum associative memory which generalizes the completing associative memory proposed by Ventura and Martinez. Similar to this model, our system is based on Grover’s well known algorithm for searching an unsorted quantum database. However, the model presented in this paper suggests the use of a distributed query of general form. It is demonstrated that spurious memories form an unavoidable part of the quantum associative memory model; however, the very presence of these spurious states provides the possibility of organizing a controlled process of data retrieval using a specially formed initial state of the quantum database and also of the transformation performed upon it. Concrete examples illustrating the properties of the proposed model are also presented.
  • Reference: Information Sciences, volume 3–4, pages 271–293, 2000.
  • BibTeX:
    @article{ezhov.is00,
    author = {Ezhov, Alexandr and Nifanova, A. and Ventura, Dan},
    title = {Distributed Queries for Quantum Associative Memory},
    journal = {Information Sciences},
    volume = {3-4},
    pages = {271--293},
    year = {2000},
    }
  • Download the file: pdf

Optically Simulating a Quantum Associative Memory

  • Authors: John Howell and John Yeazell and Dan Ventura
  • Abstract: This paper discusses the realization of a quantum associative memory using linear integrated optics. An associative memory produces a full pattern of bits when presented with only a partial pattern. Quantum computers have the potential to store large numbers of patterns and hence have the ability to far surpass any classical neural network realization of an associative memory. In this work two 3-qubit associative memories will be discussed using linear integrated optics. In addition, corrupted, invented and degenerate memories are discussed.
  • Reference: Physical Review A, volume 62, 2000. Article 42303.
  • BibTeX:
    @article{howell.pra00,
    author = {Howell, John and Yeazell, John and Ventura, Dan},
    title = {Optically Simulating a Quantum Associative Memory},
    journal = {Physical Review A},
    volume = {62},
    year = {2000},
    note = {Article 42303},
    }
  • Download the file: pdf

Quantum Associative Memory

  • Authors: Dan Ventura and Tony R. Martinez
  • Abstract: This paper combines quantum computation with classical neural network theory to produce a quantum computational learning algorithm. Quantum computation uses microscopic quantum level effects to perform computational tasks and has produced results that in some cases are exponentially faster than their classical counterparts. The unique characteristics of quantum theory may also be used to create a quantum associative memory with a capacity exponential in the number of neurons. This paper combines two quantum computational algorithms to produce such a quantum associative memory. The result is an exponential increase in the capacity of the memory when compared to traditional associative memories such as the Hopfield network. The paper covers necessary high-level quantum mechanical and quantum computational ideas and introduces a quantum associative memory. Theoretical analysis proves the utility of the memory, and it is noted that a small version should be physically realizable in the near future.
  • Reference: Information Sciences, volume 1–4, pages 273–296, 2000.
  • BibTeX:
    @article{ventura.is00,
    author = {Ventura, Dan and Martinez, Tony R.},
    title = {Quantum Associative Memory},
    journal = {Information Sciences},
    volume = {1-4},
    pages = {273--296},
    year = {2000},
    }
  • Download the file: pdf

Initializing the Amplitude Distribution of a Quantum State

  • Authors: Dan Ventura and Tony R. Martinez
  • Abstract: To date, quantum computational algorithms have operated on a superposition of all basis states of a quantum system. Typically, this is because it is assumed that some function f is known and implementable as a unitary evolution. However, what if only some points of the function f are known? It then becomes important to be able to encode only the knowledge that we have about f. This paper presents an algorithm that requires a polynomial number of elementary operations for initializing a quantum system to represent only the m known points of a function f.
  • Reference: Foundations of Physics Letters, volume 6, pages 547–559, December 1999.
  • BibTeX:
    @article{ventura.fopl99,
    author = {Ventura, Dan and Martinez, Tony R.},
    title = {Initializing the Amplitude Distribution of a Quantum State},
    journal = {Foundations of Physics Letters},
    volume = {6},
    pages = {547--559},
    month = {December},
    year = {1999},
    }
  • Download the file: pdf

A Quantum Associative Memory Based on Grover’s Algorithm

  • Authors: Dan Ventura and Tony R. Martinez
  • Abstract: Quantum computation uses microscopic quantum level effects to perform computational tasks and has produced results that in some cases are exponentially faster than their classical counterparts. The unique characteristics of quantum theory may also be used to create a quantum associative memory with a capacity exponential in the number of neurons. This paper combines two quantum computational algorithms to produce a quantum associative memory. The result is an exponential increase in the capacity of the memory when compared to traditional associative memories such as the Hopfield network. This paper covers necessary high-level quantum mechanical ideas and introduces a quantum associative memory, a small version of which should be physically realizable in the near future.
  • Reference: In Proceedings of the International Conference on Artificial Neural Networks and Genetic Algorithms, pages 22–27, April 1999.
  • BibTeX:
    @inproceedings{ventura.icannga99,
    author = {Ventura, Dan and Martinez, Tony R.},
    title = {A Quantum Associative Memory Based on Grover's Algorithm},
    booktitle = {Proceedings of the International Conference on Artificial Neural Networks and Genetic Algorithms},
    pages = {22--27},
    month = {April},
    year = {1999},
    }
  • Download the file: ps, pdf
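The Grover iteration underlying this line of quantum associative memory work is easy to simulate classically at small scale: phase-flip the amplitude of the marked (stored) state, then invert all amplitudes about their mean. The sketch below is a plain classical simulation for illustration, assuming a single marked state and a uniform initial superposition; it is not the paper's memory model, which modifies both the initial state and the operators.

```python
import math

def grover_search(n_qubits, marked):
    """Classically simulate Grover's algorithm over 2**n_qubits basis states:
    start uniform, then repeat (oracle phase flip, inversion about the mean)
    approximately (pi/4) * sqrt(N) times."""
    n = 2 ** n_qubits
    amp = [1.0 / math.sqrt(n)] * n          # uniform superposition
    iterations = int(round(math.pi / 4 * math.sqrt(n)))
    for _ in range(iterations):
        amp[marked] = -amp[marked]          # oracle: flip the marked amplitude
        mean = sum(amp) / n
        amp = [2 * mean - a for a in amp]   # inversion about the mean
    return [a * a for a in amp]             # measurement probabilities

# With 4 qubits (16 states), 3 iterations concentrate nearly all
# probability on the marked state.
probs = grover_search(4, marked=5)
```

The quadratic speedup comes from the iteration count scaling as sqrt(N) rather than N; the memory papers above adapt this amplification step so that several stored patterns, rather than one oracle-marked item, are retrievable.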

Quantum Computational Intelligence: Answers and Questions

  • Authors: Dan Ventura
  • Abstract: This is a brief article discussing the interesting possibilities and potential difficulties with combining classical computational intelligence with quantum computation. See http://www.computer.org/intelligent/ex1999/pdf/x4009.pdf for a copy of the article.
  • Reference: IEEE Intelligent Systems, volume 4, pages 14–16, 1999.
  • BibTeX:
    @article{ventura.ieeeis99,
    author = {Ventura, Dan},
    title = {Quantum Computational Intelligence: Answers and Questions},
    journal = {{IEEE} Intelligent Systems},
    volume = {4},
    pages = {14--16},
    year = {1999},
    }
  • Download the file: pdf

Implementing Competitive Learning in a Quantum System

  • Authors: Dan Ventura
  • Abstract: Ideas from quantum computation are applied to the field of neural networks to produce competitive learning in a quantum system. The resulting quantum competitive learner has a prototype storage capacity that is exponentially greater than that of its classical counterpart. Further, empirical results from simulation of the quantum competitive learning system on real-world data sets demonstrate the quantum system’s potential for excellent performance.
  • Reference: In Proceedings of the International Joint Conference on Neural Networks (IJCNN’99), paper 513, 1999.
  • BibTeX:
    @inproceedings{ventura.ijcnn99a,
    author = {Ventura, Dan},
    title = {Implementing Competitive Learning in a Quantum System},
    booktitle = {Proceedings of the International Joint Conference on Neural Networks ({IJCNN}'99), paper 513},
    year = {1999},
    }
  • Download the file: ps, pdf

A Neural Model of Centered Tri-gram Speech Recognition

  • Authors: Dan Ventura and D. Randall Wilson and Brian Moncur and Tony R. Martinez
  • Abstract: A relaxation network model that includes higher order weight connections is introduced. To demonstrate its utility, the model is applied to the speech recognition domain. Traditional speech recognition systems typically consider only that context preceding the word to be recognized. However, intuition suggests that considering following context as well as preceding context should improve recognition accuracy. The work described here tests this hypothesis by applying the higher order relaxation network to consider both precedes and follows context in a speech recognition task. The results demonstrate both the general utility of the higher order relaxation network as well as its improvement over traditional methods on a speech recognition task.
  • Reference: In Proceedings of the International Joint Conference on Neural Networks (IJCNN’99), paper 2188, 1999.
  • BibTeX:
    @inproceedings{ventura.ijcnn99b,
    author = {Ventura, Dan and Wilson, D. Randall and Moncur, Brian and Martinez, Tony R.},
    title = {A Neural Model of Centered Tri-gram Speech Recognition},
    booktitle = {Proceedings of the International Joint Conference on Neural Networks ({IJCNN}'99), paper 2188},
    year = {1999},
    }
  • Download the file: pdf, ps

Artificial Associative Memory using Quantum Processes

  • Authors: Dan Ventura
  • Abstract: This paper discusses an approach to constructing an artificial quantum associative memory (QuAM). The QuAM makes use of two quantum computational algorithms, one for pattern storage and the other for pattern recall. The result is an exponential increase in the capacity of the memory when compared to traditional associative memories such as the Hopfield network. Further, the paper argues for considering pattern recall as a non-unitary process and demonstrates the utility of non-unitary operators for improving the pattern recall performance of the QuAM.
  • Reference: In Proceedings of the Joint Conference on Information Sciences, volume 2, pages 218–221, October 1998.
  • BibTeX:
    @inproceedings{ventura.jcis98,
    author = {Ventura, Dan},
    title = {Artificial Associative Memory using Quantum Processes},
    booktitle = {Proceedings of the Joint Conference on Information Sciences},
    volume = {2},
    pages = {218--221},
    month = {October},
    year = {1998},
    }
  • Download the file: ps, pdf

Quantum and Evolutionary Approaches to Computational Learning

  • Authors: Dan Ventura
  • Abstract: This dissertation presents two methods for attacking the problem of high dimensional spaces inherent in most computational learning problems. The first approach is a hybrid system for combining the thorough search capabilities of evolutionary computation with the speed and generalization of neural computation. This neural/evolutionary hybrid is utilized in three different settings: to address the problem of data acquisition for training a supervised learning system; as a learning optimization system; and as a system for developing neurocontrol. The second approach is the idea of quantum computational learning that overcomes the “curse of dimensionality” by taking advantage of the massive state space of quantum systems to process information in a way that is classically impossible. The quantum computational learning approach results in the development of a neuron with quantum mechanical properties, a quantum associative memory and a quantum computational learning system for inductive learning.
  • Reference: PhD thesis, Brigham Young University, Computer Science Department, August 1998.
  • BibTeX:
    @phdthesis{ventura.dissertation,
    author = {Ventura, Dan},
    title = {Quantum and Evolutionary Approaches to Computational Learning},
    school = {Brigham Young University},
    address = {Computer Science Department},
    month = {August},
    year = {1998},
    }
  • Download the file: ps

Quantum Associative Memory with Exponential Capacity

  • Authors: Dan Ventura and Tony R. Martinez
  • Abstract: Quantum computation uses microscopic quantum level effects to perform computational tasks and has produced results that in some cases are exponentially faster than their classical counterparts by taking advantage of quantum parallelism. The unique characteristics of quantum theory may also be used to create a quantum associative memory with a capacity exponential in the number of neurons. This paper covers necessary high-level quantum mechanical ideas and introduces a simple quantum associative memory. Further, it provides discussion, empirical results and directions for future work.
  • Reference: In Proceedings of the International Joint Conference on Neural Networks, pages 509–513, May 1998.
  • BibTeX:
    @inproceedings{ventura.ijcnn98a,
    author = {Ventura, Dan and Martinez, Tony R.},
    title = {Quantum Associative Memory with Exponential Capacity},
    booktitle = {Proceedings of the International Joint Conference on Neural Networks},
    pages = {509--513},
    month = {May},
    year = {1998},
    }
  • Download the file: ps, pdf

Optimal Control Using a Neural/Evolutionary Hybrid System

  • Authors: Dan Ventura and Tony R. Martinez
  • Abstract: One of the biggest hurdles to developing neurocontrollers is the difficulty in establishing good training data for the neural network. We propose a hybrid approach to the development of neurocontrollers that employs both evolutionary computation (EC) and neural networks (NN). EC is used to discover appropriate control actions for specific plant states. The survivors of the evolutionary process are used to construct a training set for the NN. The NN learns the training set, is able to generalize to new plant states, and is then used for neurocontrol. Thus the EC/NN approach combines the broad, parallel search of EC with the rapid execution and generalization of NN to produce a viable solution to the control problem. This paper presents the EC/NN hybrid and demonstrates its utility in developing a neurocontroller that demonstrates stability, generalization, and optimality.
  • Reference: In Proceedings of the International Joint Conference on Neural Networks, pages 1036–1041, May 1998.
  • BibTeX:
    @inproceedings{ventura.ijcnn98b,
    author = {Ventura, Dan and Martinez, Tony R.},
    title = {Optimal Control Using a Neural/Evolutionary Hybrid System},
    booktitle = {Proceedings of the International Joint Conference on Neural Networks},
    pages = {1036--1041},
    month = {May},
    year = {1998},
    }
  • Download the file: pdf, ps
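
The pipeline in the abstract (evolve actions, then train a learner on the survivors) can be sketched briefly. The toy plant (next state = x + a, with the goal of driving x to 0) and the linear least-squares fit standing in for the NN are illustrative assumptions, not the paper's setup.

```python
# Hedged sketch of the EC/NN hybrid: EC discovers good actions for
# sampled plant states; the survivors become a training set; a learner
# then generalizes over the whole state space.
import numpy as np

rng = np.random.default_rng(1)

def fitness(state, action):
    return -(state + action) ** 2    # reward actions that drive the state to 0

def evolve_action(state, pop_size=30, gens=40, sigma=0.3):
    pop = rng.uniform(-2, 2, pop_size)
    for _ in range(gens):
        # keep the fitter half, refill with mutated copies
        pop = pop[np.argsort([fitness(state, a) for a in pop])][pop_size // 2:]
        pop = np.concatenate([pop, pop + rng.normal(0, sigma, len(pop))])
    return max(pop, key=lambda a: fitness(state, a))

# 1) EC discovers control actions for a handful of sampled plant states.
states = np.linspace(-1, 1, 9)
actions = np.array([evolve_action(s) for s in states])

# 2) The survivors form the training set; the learner generalizes.
w, b = np.polyfit(states, actions, 1)   # stand-in for training the NN
print(w, b)                             # slope near -1, intercept near 0: a = -x
```

Note the division of labor this illustrates: the evolutionary stage needs only a fitness function, not labeled examples, while the supervised stage supplies fast execution and interpolation to states the EC never visited.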

Using Evolutionary Computation to Facilitate Development of Neurocontrol

  • Authors: Dan Ventura and Tony R. Martinez
  • Abstract: The field of neurocontrol, in which neural networks are used for control of complex systems, has many potential applications. One of the biggest hurdles to developing neurocontrollers is the difficulty in establishing good training data for the neural network. We propose a hybrid approach to the development of neurocontrollers that employs both evolutionary computation (EC) and neural networks (NN). EC is used to discover appropriate control actions for specific system states. The survivors of this evolutionary process are used to construct a training set for the NN. The NN learns the training set, is able to generalize to new system states, and is then used for neurocontrol. Thus the EC/NN approach combines the broad, parallel search of EC with the rapid execution and generalization of NN to produce a viable solution to the control problem. This paper presents the EC/NN hybrid and demonstrates its utility in developing a neurocontroller for the pole balancing problem.
  • Reference: In Proceedings of the International Workshop on Neural Networks and Neurocontrol, August 1997.
  • BibTeX:
    @inproceedings{ventura.sian97,
    author = {Ventura, Dan and Martinez, Tony R.},
    title = {Using Evolutionary Computation to Facilitate Development of Neurocontrol},
    booktitle = {Proceedings of the International Workshop on Neural Networks and Neurocontrol},
    month = {August},
    year = {1997},
    }
  • Download the file: pdf, ps

An Artificial Neuron with Quantum Mechanical Properties

  • Authors: Dan Ventura and Tony R. Martinez
  • Abstract: Quantum computation uses microscopic quantum level effects to perform computational tasks and has produced results that in some cases are exponentially faster than their classical counterparts. Choosing the best weights for a neural network is a time-consuming problem that makes the harnessing of this quantum parallelism appealing. This paper briefly covers high-level quantum theory and introduces a model for a quantum neuron.
  • Reference: In Proceedings of the International Conference on Neural Networks and Genetic Algorithms, pages 482–485, 1997.
  • BibTeX:
    @inproceedings{ventura.icannga97,
    author = {Ventura, Dan and Martinez, Tony R.},
    title = {An Artificial Neuron with Quantum Mechanical Properties},
    booktitle = {Proceedings of the International Conference on Neural Networks and Genetic Algorithms},
    pages = {482--485},
    year = {1997},
    }
  • Download the file: pdf, ps

Concerning a General Framework for the Development of Intelligent Systems

  • Authors: Dan Ventura and Tony R. Martinez
  • Abstract: There is ongoing debate between Connectionism and Symbolism as to the nature of and approaches to cognition. Many viewpoints exist, and various issues seen as important have been raised. This paper suggests that a combination of these methodologies will lead to a better overall model. The paper reviews and assimilates the opinions and viewpoints of these diverse fields and provides a cohesive list of issues thought to be critical to the modeling of intelligence. Further, this list results in a framework for the development of a general, unified theory of cognition.
  • Reference: In Proceedings of the IASTED International Conference on Artificial Intelligence, Expert Systems and Neural Networks, pages 44–47, 1996.
  • BibTeX:
    @inproceedings{ventura.iasted96,
    author = {Ventura, Dan and Martinez, Tony R.},
    title = {Concerning a General Framework for the Development of Intelligent Systems},
    booktitle = {Proceedings of the {IASTED} International Conference on Artificial Intelligence, Expert Systems and Neural Networks},
    pages = {44--47},
    year = {1996},
    }
  • Download the file: pdf, ps

Robust Optimization Using Training Set Evolution

  • Authors: Dan Ventura and Tony R. Martinez
  • Abstract: Training Set Evolution is an eclectic optimization technique that combines evolutionary computation (EC) with neural networks (NN). The synthesis of EC with NN provides both initial unsupervised random exploration of the solution space as well as supervised generalization on those initial solutions. An assimilation of a large amount of data obtained over many simulations provides encouraging empirical evidence for the robustness of Evolutionary Training Sets as an optimization technique for feedback and control problems.
  • Reference: In Proceedings of the International Conference on Neural Networks, pages 524–528, 1996.
  • BibTeX:
    @inproceedings{ventura.icnn96,
    author = {Ventura, Dan and Martinez, Tony R.},
    title = {Robust Optimization Using Training Set Evolution},
    booktitle = {Proceedings of the International Conference on Neural Networks},
    pages = {524--528},
    year = {1996},
    }
  • Download the file: ps, pdf

A General Evolutionary/Neural Hybrid Approach to Learning Optimization Problems

  • Authors: Dan Ventura and Tony R. Martinez
  • Abstract: A method combining the parallel search capabilities of Evolutionary Computation (EC) with the generalization of Neural Networks (NN) for solving learning optimization problems is presented. Assuming a fitness function for potential solutions can be found, EC can be used to explore the solution space, and the survivors of the evolution can be used as a training set for the NN, which then generalizes over the entire space. Because the training set is generated by EC using a fitness function, this hybrid approach allows explicit control of training set quality.
  • Reference: In Proceedings of the World Congress on Neural Networks, pages 1091–1095, 1996.
  • BibTeX:
    @inproceedings{ventura.wcnn96,
    author = {Ventura, Dan and Martinez, Tony R.},
    title = {A General Evolutionary/Neural Hybrid Approach to Learning Optimization Problems},
    booktitle = {Proceedings of the World Congress on Neural Networks},
    pages = {1091--1095},
    year = {1996},
    }
  • Download the file: pdf, ps

On Discretization as a Preprocessing Step For Supervised Learning Models

  • Authors: Dan Ventura
  • Abstract: Many machine learning and neurally inspired algorithms are limited, at least in their pure form, to working with nominal data. However, for many real-world problems, some provision must be made to support processing of continuously valued data. BRACE, a paradigm for the discretization of continuously valued attributes, is introduced, and two algorithmic instantiations of this paradigm, VALLEY and SLICE, are presented. These methods are compared empirically with other discretization techniques on several real-world problems, and no algorithm clearly outperforms the others. Also, discretization as a preprocessing step is in many cases found to be inferior to direct handling of continuously valued data. These results suggest that machine learning algorithms should be designed to directly handle continuously valued data rather than relying on preprocessing or ad hoc techniques. To this end, statistical prototypes (SP/MSP) are developed, and an empirical comparison with well-known learning algorithms is presented. Encouraging results demonstrate that statistical prototypes have the potential to handle continuously valued data well. However, at this point, they are not suited for handling nominally valued data, which is arguably at least as important as continuously valued data in learning real-world applications. Several areas of ongoing research that aim to provide this ability are presented.
  • Reference: Master’s thesis, Brigham Young University, Computer Science Department, April 1995.
  • BibTeX:
    @mastersthesis{ventura.thesis,
    author = {Ventura, Dan},
    title = {On Discretization as a Preprocessing Step For Supervised Learning Models},
    school = {Brigham Young University},
    address = {Computer Science Department},
    month = {April},
    year = {1995},
    }
  • Download the file: ps

Using Evolutionary Computation to Generate Training Set Data for Neural Networks

  • Authors: Dan Ventura and Tim L. Andersen and Tony R. Martinez
  • Abstract: Most neural networks require a set of training examples in order to attempt to approximate a problem function. For many real-world problems, however, such a set of examples is unavailable. One such problem, involving feedback optimization of a computer network routing system, has motivated a general method of generating artificial training sets using evolutionary computation. This paper describes the method and demonstrates its utility by presenting promising results from applying it to an artificial problem similar to a real-world network routing optimization problem.
  • Reference: In Proceedings of the International Conference on Artificial Neural Networks and Genetic Algorithms, pages 468–471, 1995.
  • BibTeX:
    @inproceedings{ventura.icannga95,
    author = {Ventura, Dan and Andersen, Tim L. and Martinez, Tony R.},
    title = {Using Evolutionary Computation to Generate Training Set Data for Neural Networks},
    booktitle = {Proceedings of the International Conference on Artificial Neural Networks and Genetic Algorithms},
    pages = {468--471},
    year = {1995},
    }
  • Download the file: pdf, ps

An Empirical Comparison of Discretization Models

  • Authors: Dan Ventura and Tony R. Martinez
  • Abstract: Many machine learning and neurally inspired algorithms are limited, at least in their pure form, to working with nominal data. However, for many real-world problems, some provision must be made to support processing of continuously valued data. This paper presents empirical results obtained by using six different discretization methods as preprocessors to three different supervised learners on several real-world problems. No discretization technique clearly outperforms the others. Also, discretization as a preprocessing step is in many cases found to be inferior to direct handling of continuously valued data. These results suggest that machine learning algorithms should be designed to directly handle continuously valued data rather than relying on preprocessing or ad hoc techniques.
  • Reference: In Proceedings of the 10th International Symposium on Computer and Information Sciences, pages 443–450, 1995.
  • BibTeX:
    @inproceedings{ventura.iscis95,
    author = {Ventura, Dan and Martinez, Tony R.},
    title = {An Empirical Comparison of Discretization Models},
    booktitle = {Proceedings of the 10th International Symposium on Computer and Information Sciences},
    pages = {443--450},
    year = {1995},
    }
  • Download the file: ps, pdf
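
As a concrete example of the preprocessing step being compared, a simple equal-width binner (one common baseline; the paper's six methods are not reproduced here) maps continuous values to nominal bin indices, which is all the downstream learner then sees:

```python
# Minimal discretization-as-preprocessing sketch: equal-width binning.
import numpy as np

def equal_width_bins(values, k):
    """Map continuous values to k nominal bins of equal width."""
    lo, hi = values.min(), values.max()
    edges = np.linspace(lo, hi, k + 1)[1:-1]   # k-1 interior cut points
    return np.digitize(values, edges)          # bin index in 0..k-1

x = np.array([0.1, 0.15, 0.5, 0.52, 0.9, 0.95])
print(equal_width_bins(x, 3).tolist())         # → [0, 0, 1, 1, 2, 2]
```

The information discarded here (how far a value sits inside its bin, and where the natural cluster boundaries fall) is precisely what the paper suggests learners should instead handle directly.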

Using Multiple Statistical Prototypes to Classify Continuously Valued Data

  • Authors: Dan Ventura and Tony R. Martinez
  • Abstract: Multiple Statistical Prototypes (MSP) is a modification of a standard minimum distance classification scheme that generates multiple prototypes per class using a modified greedy heuristic. Empirical comparison of MSP with other well-known learning algorithms shows MSP to be a robust algorithm that uses a very simple premise to produce good generalization and achieve parsimonious hypothesis representation.
  • Reference: In Proceedings of the International Symposium on Neuroinformatics and Neurocomputers, pages 238–245, 1995.
  • BibTeX:
    @inproceedings{ventura.isninc95,
    author = {Ventura, Dan and Martinez, Tony R.},
    title = {Using Multiple Statistical Prototypes to Classify Continuously Valued Data},
    booktitle = {Proceedings of the International Symposium on Neuroinformatics and Neurocomputers},
    pages = {238--245},
    year = {1995},
    }
  • Download the file: ps, pdf
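
The minimum-distance idea behind MSP can be sketched briefly. The greedy rule below (adopt a misclassified training point as a new prototype until training accuracy stops improving) is a simplified stand-in for the paper's modified greedy heuristic:

```python
# Hedged sketch of a minimum-distance classifier with multiple
# prototypes per class, in the spirit of MSP; the growth rule is a
# simplified assumption, not the paper's heuristic.
import numpy as np

def predict(prototypes, labels, X):
    d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    return labels[np.argmin(d, axis=1)]        # label of the nearest prototype

def fit_msp(X, y):
    classes = np.unique(y)
    protos = np.array([X[y == c].mean(axis=0) for c in classes])  # one mean per class
    labels = classes.copy()
    best = (predict(protos, labels, X) == y).mean()
    while True:
        wrong = np.flatnonzero(predict(protos, labels, X) != y)
        if len(wrong) == 0:
            break
        i = wrong[0]                           # adopt a misclassified point as a prototype
        protos = np.vstack([protos, X[i]])
        labels = np.append(labels, y[i])
        acc = (predict(protos, labels, X) == y).mean()
        if acc <= best:                        # stop when accuracy no longer improves
            protos, labels = protos[:-1], labels[:-1]
            break
        best = acc
    return protos, labels

# Class 0 occupies two clusters flanking class 1, so one mean per class fails.
X = np.array([[0.0], [0.5], [5.0], [5.5], [2.4], [2.6]])
y = np.array([0, 0, 0, 0, 1, 1])
protos, labels = fit_msp(X, y)
print(len(protos), (predict(protos, labels, X) == y).mean())   # → 3 1.0
```

Three prototypes suffice where two class means could not, which is the parsimony-versus-accuracy trade the abstract describes.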

BRACE: A Paradigm for the Discretization of Continuously Valued Data

  • Authors: Dan Ventura and Tony R. Martinez
  • Abstract: Discretization of continuously valued data is a useful and necessary tool because many learning paradigms assume nominal data. A list of objectives for efficient and effective discretization is presented. A paradigm called BRACE (Boundary Ranking And Classification Evaluation) that attempts to meet the objectives is presented, along with an algorithm that follows the paradigm. The paradigm meets many of the objectives, with potential for extension to meet the remainder. Empirical results have been promising. For these reasons, BRACE has potential as an effective and efficient method for discretization of continuously valued data. A further advantage of BRACE is that it is general enough to be extended to other types of clustering/unsupervised learning.
  • Reference: In Proceedings of the Seventh Florida Artificial Intelligence Research Symposium, pages 117–121, 1994.
  • BibTeX:
    @inproceedings{ventura.flairs94,
    author = {Ventura, Dan and Martinez, Tony R.},
    title = {{BRACE}: A Paradigm for the Discretization of Continuously Valued Data},
    booktitle = {Proceedings of the Seventh Florida Artificial Intelligence Research Symposium},
    pages = {117--121},
    year = {1994},
    }
  • Download the file: pdf, ps
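
A plausible sketch of the paradigm's two stages (rank candidate boundaries, then evaluate the candidate discretizations they induce) follows. The gap-based ranking and the variance-plus-penalty score are illustrative assumptions, not the paper's VALLEY or SLICE algorithms:

```python
# Hedged sketch in the spirit of BRACE: rank candidate cut points
# (midpoints of the largest gaps in the sorted data, a crude valley
# detector), then evaluate each candidate discretization and keep the
# best-scoring one. Both criteria are assumptions made for illustration.
import numpy as np

def brace_like(values, max_cuts=4):
    xs = np.sort(values)
    mids = (xs[:-1] + xs[1:]) / 2
    ranked = mids[np.argsort(np.diff(xs))[::-1]]   # boundary ranking: big gaps first
    best_cuts, best_score = [], -np.inf
    for k in range(1, max_cuts + 1):               # evaluation stage
        cuts = np.sort(ranked[:k])
        bins = np.digitize(values, cuts)
        # score: low within-bin variance, with a small cost per extra boundary
        score = -sum(values[bins == b].var() for b in np.unique(bins)) - 0.1 * k
        if score > best_score:
            best_cuts, best_score = cuts.tolist(), score
    return best_cuts

x = np.array([0.1, 0.2, 0.3, 5.0, 5.1, 5.2, 9.0, 9.1])
print([round(c, 2) for c in brace_like(x)])        # → [2.65, 7.1]
```

On this toy data the evaluation stage settles on two boundaries, splitting the three visible clusters, because a third cut buys too little variance reduction to pay its penalty.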
