College Projects


Knowledge Encoding and Retrieval Tool (KERT)

My master's degree in Computer Science at Michigan State University culminated in the first half of 1995. During this time I worked on my master's project, the Knowledge Encoding and Retrieval Tool (KERT), under the faculty direction of Dr. Carrie Heeter, Director of the Communication Technology Lab, and Dr. William Punch and Dr. Don Wienshank, both professors in the Computer Science Department.

KERT was derived from multidisciplinary research in the areas of Computer Science, Education, Cognitive Science, and Hypermedia. It is an attempt to resolve many of the deficiencies found in hypermedia systems. KERT is composed of an underlying hypermedia structure coupled with a comprehensive set of navigational and learning tools to more fully realize the power of hypermedia for thinking and learning. It is both a multimedia developer's tool and an end-user environment.

Many of the concepts and tools constructed during this master's project have been adapted and applied to various commercial products developed in the Communication Technology Lab, including Discovering Our Environment: A Case Study in the Virgin Islands and the American Identity Explorer.

The complete project write-up is available online.

The Case of the Salami Slicer

The course that changed my academic direction (and probably my life) took place in the Fall of 1994: Hypermedia Design (TC 466 with Carrie Heeter). The course was a blend of theory and practical hands-on experience, covering topics in user-interface design, multimedia authoring, and CD-ROM and Web publishing. Prior to this, I had been quite interested in multimedia, dabbling with HyperCard and Director. But it was this course that solidified my interests. At the time, the technical side of the course was based around HyperCard, but I taught myself Director on my own and did all the class projects in Director. The most notable of the class projects was an in-depth self-portrait. Rather than make the project an information-based browser of facts about myself, I decided to frame it in the setting of a game. The interactive game combines full-color animation, digital video, and sound.

The setup: "You are a rookie in the competitive field of investigative reporting for the MSU State News. All that you need is just one big story and you will be famous throughout MSU. Unfortunately, you keep getting assigned to lame stories to investigate and today seems like no exception..."

In the game, you must investigate Brian Winn, a local computer hacker who is believed to have broken into the MSU Credit Union, "salami sliced" off the half cents from each account, and forwarded them to his own account. To do this, you must explore the FBI files on Brian through your computer, call and interrogate his friends via your video phone, and search his video library for any possible clues. When you have completed the investigation, you must publish the story in the MSU State News. But you have to be careful: if you report a false story, you may get sued...

Vehicle Navigation System

In the Fall of 1994, I took the graduate level Advanced Databases (CPS 880 with Sakti Pramanik) at Michigan State University. As a final project in the course, my partner, Ryan McFall, and I created a prototype Vehicle Navigation System.

The goal of the project was to design a query language suitable for a real-time automotive navigation system. The language, dubbed the Vehicle Navigation Query Language (VNQL), interfaced with a HiTi-Graph-model Sybase database for road map information and a pseudo-global positioning system (GPS) for vehicle location. The language supported several types of navigational and informational queries a user could perform while driving. The project also investigated the human-interface details that an online, real-time navigation system would require. A prototype interface was constructed, given realistic hardware and cost constraints.
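The kind of routing query such a system answers can be illustrated with a short sketch. The road graph, intersection names, and function below are hypothetical stand-ins (the actual VNQL syntax and HiTi-Graph storage model are not reproduced here); the sketch answers one representative query, the shortest route from the vehicle's current position to a destination, using Dijkstra's algorithm:

```python
# Illustrative sketch only -- not the original VNQL implementation.
# Road map as an adjacency list: {intersection: [(neighbor, miles), ...]}.
import heapq

def shortest_route(roads, start, goal):
    """Dijkstra's algorithm: returns (total miles, list of intersections)."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        dist, node, path = heapq.heappop(queue)
        if node == goal:
            return dist, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, miles in roads.get(node, []):
            if neighbor not in visited:
                heapq.heappush(queue, (dist + miles, neighbor, path + [neighbor]))
    return None  # no route exists

# Hypothetical East Lansing road segments for demonstration.
roads = {
    "Grand River & Hagadorn": [("Grand River & Abbot", 2.1)],
    "Grand River & Abbot": [("Saginaw & Abbot", 0.8), ("Grand River & Hagadorn", 2.1)],
    "Saginaw & Abbot": [],
}
```

A call such as `shortest_route(roads, "Grand River & Hagadorn", "Saginaw & Abbot")` yields the cumulative distance and the intersection-by-intersection path, the raw material for turn-by-turn guidance.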

It is my understanding that further research and development has been done on the project during later offerings of the course.


LlamaCalc

In the summer of 1994, my friend Scott Connell and I took an independent study course (CPS 890 with Jon Sticklen) to learn NewtonScript programming.

To learn the language, we designed various applications for the Newton. The most notable of them was LlamaCalc, a powerful calculator with many features, including complete math, scientific, and financial operations, programmability, and graphing capabilities.

A Rule-Based System for an Intelligent Auto-Pilot

My strong interests coming into graduate school during the Fall of 1993 were in Artificial Intelligence. One of the first courses I took was graduate-level Artificial Intelligence (CPS 841 with Bill Punch). For the final project in the course, I created a rule-based system that acts as an "autopilot," flying multiple planes through an airspace to their destinations while avoiding various obstacles and other aircraft, all without violating any of the FAA rules of the airspace.

The tool created was a solid forward-chaining rule-based system with variable bindings. Conflict resolution was based on rule ordering. The tool was generalizable enough to be applied to other domains.
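As a rough illustration of how such an engine operates, here is a minimal forward-chaining matcher with variable bindings and rule-ordering conflict resolution. It is a sketch of the general technique, not the original CPS 841 tool: facts are tuples, and terms beginning with "?" are variables.

```python
# Minimal forward-chaining rule engine sketch (illustrative, not the original).

def match(pattern, fact, bindings):
    """Unify one pattern with one fact under existing variable bindings."""
    if len(pattern) != len(fact):
        return None
    b = dict(bindings)
    for p, f in zip(pattern, fact):
        if isinstance(p, str) and p.startswith("?"):
            if p in b and b[p] != f:
                return None      # variable already bound to something else
            b[p] = f
        elif p != f:
            return None
    return b

def match_all(conditions, facts, bindings):
    """Yield every binding set that satisfies all conditions at once."""
    if not conditions:
        yield bindings
        return
    first, rest = conditions[0], conditions[1:]
    for fact in facts:
        b = match(first, fact, bindings)
        if b is not None:
            yield from match_all(rest, facts, b)

def substitute(template, bindings):
    return tuple(bindings.get(t, t) for t in template)

def forward_chain(rules, facts):
    """Fire rules in definition order (conflict resolution by rule ordering)
    until no rule can add a new fact."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:      # earlier rules win
            for b in match_all(conditions, facts, {}):
                new_fact = substitute(conclusion, b)
                if new_fact not in facts:
                    facts.add(new_fact)
                    changed = True
                    break                         # re-scan from the top rule
            if changed:
                break
    return facts
```

For example, a (hypothetical) airspace rule `("at", "?plane", "?zone") ∧ ("restricted", "?zone") → ("violation", "?plane")` would, given the facts `("at", "N123", "zone4")` and `("restricted", "zone4")`, derive `("violation", "N123")`.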

The flight navigation system domain had on the order of 200 rules to drive multiple aircraft under fairly realistic conditions. The airspace was defined by a user-provided map descriptor file, which contained a bump-map terrain and various landmarks such as cities, airports, radio towers, and military testing areas. The aircraft flight plans were defined by the user through a series of flight plan descriptor files, one for each aircraft, each containing the aircraft's starting point and its desired final destination. The final destination could be an airport or an altitude, direction, and location at which to leave the given airspace. Information about aircraft avoidance, near- and far-miss situations, restricted zones, and other events was tracked and reported.

Evolutionary Construction of Neural Networks

I had worked with neural networks in the QSAR project (below), but had not received any formal instruction on the subject. So, during the Fall of 1993 I took Advanced Neural Networks (CPS 885 with Anil Jain). For the final project, my partner, Dave Pettigrew, and I created a system that would evolve neural network topologies using genetic algorithms.

With the system, a user could provide the training and testing data. The system would then randomly create a population of neural networks. These networks would then be trained, using the backpropagation algorithm, and tested to assess their "fitness."

The system would then make use of a genetic algorithm to evolve the neural network topologies into the next generation of networks. The process of training, testing, and evolving would continue for several generations until an optimal network topology was found.
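The train-test-evolve loop can be sketched as follows. This is an illustration of the general technique, not the original system: here a genome is simply a list of hidden-layer sizes, and the fitness function is a lightweight stand-in (distance from a hypothetical best topology) for the real train-with-backpropagation-then-test step, which is too heavy to reproduce in a sketch.

```python
# Toy sketch of evolving network topologies with a genetic algorithm.
import random

TARGET = [8, 4]  # hypothetical stand-in for the unknown optimal topology

def fitness(genome):
    """Higher is better. The real system trained (backprop) and tested each
    network here; this stand-in scores closeness to TARGET instead."""
    return (-sum(abs(a - b) for a, b in zip(genome, TARGET))
            - 5 * abs(len(genome) - len(TARGET)))

def mutate(genome):
    """Jitter layer sizes; occasionally grow or shrink the network."""
    g = [max(1, n + random.choice([-1, 0, 1])) for n in genome]
    if random.random() < 0.1:
        g = g + [random.randint(1, 16)] if random.random() < 0.5 else (g[:-1] or g)
    return g

def crossover(a, b):
    """Single-point crossover of two layer-size lists."""
    cut = random.randint(0, min(len(a), len(b)))
    return a[:cut] + b[cut:]

def evolve(pop_size=20, generations=30):
    random.seed(0)  # deterministic for the sketch
    pop = [[random.randint(1, 16) for _ in range(random.randint(1, 3))]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)
```

Swapping the stand-in `fitness` for a real train-and-test cycle recovers the structure the project used: each generation is trained, scored, selected, recombined, and mutated.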

Additional information available online.

Analogical Reasoning Through Neural Networks: A Tool for Quantitative Structure-Activity Relationship (QSAR) Analysis

During the Spring of 1993 I took part in the Undergraduate Research Opportunities Program (UROP). In this program, I worked with Dr. Timothy Colburn, a professor of Computer Science at the University of Minnesota-Duluth, and Dr. Subhash Basak, a chemistry researcher at the Natural Resources Research Institute (NRRI). The project investigated the possibility of using emerging neural network technology in Quantitative Structure-Activity Relationship (QSAR) analysis of anticonvulsant drugs.

After the completion of the research program, I stayed on and worked at NRRI for the summer of 1993 to continue my research. During this time I built a neural network construction and testing application, named MODEL, so the research could continue after I departed.

Additional information available online.