Graduate Seminar 2004 Abstracts

The JVM and CLR

Dan Agar

Abstract: The JVM (Java Virtual Machine) and CLR (Common Language Runtime) are the underlying runtime environments for Sun's Java and Microsoft's .NET platforms, respectively. These two rival technologies are competing for developers. Powered by the architecture-neutral JVM, Java has long been a favorite development platform for large-scale enterprise applications and has also made some progress into the world of the desktop. Microsoft's recent release of its language-neutral .NET platform, however, may erode Java's market share in these areas. This seminar will examine the underlying runtime environments of these two technologies. It will look at many technical aspects of the two virtual machines and compare their runtime efficiency, safety, and extensibility.


Processor-in-memory Computing

Sylvain Eudier

Abstract: Following Moore's Law, microprocessor performance has doubled roughly every 18 months for many years now. As production processes shrink, microprocessors gain larger dies and faster clock cycles. As a consequence, the maximum distance between two points in the circuit increases when measured in clock cycles. One response is deeper pipelining, despite tradeoffs such as added latency, longer cache access times, and greater design complexity. The biggest problem, however, is the bottleneck (the Von Neumann bottleneck) between the processor and the memory: while the former has regularly doubled its performance, the latter has improved its bandwidth by only about 7% per year. As this gap became a real problem, solutions such as caching, multithreading, and branch prediction were developed. Even though these techniques bring significant improvements, they do not benefit all algorithms: caching, for example, is of little use in many data-intensive applications. A more radical solution is a new architecture called Processing in Memory (PIM), which avoids the Von Neumann bottleneck described above. The idea is to integrate high-density DRAM and CMOS logic on the same chip, i.e., an array of processing units inside the memory. Such architectures are expected to provide higher scalability and lower power consumption for parallel systems, offering the possibility of truly parallel computation.
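
As a rough back-of-the-envelope illustration of the processor-memory gap described above, the short sketch below simply compounds the two growth rates quoted in the abstract (processor performance doubling every 18 months, memory bandwidth improving about 7% per year); the exact figures printed are illustrative only.

```python
# Illustration of the processor-memory gap using the rates quoted above:
# processor performance doubles every 18 months (~1.59x per year),
# memory bandwidth improves ~7% per year.

cpu_growth_per_year = 2 ** (12 / 18)   # ~1.587x per year
mem_growth_per_year = 1.07

cpu, mem = 1.0, 1.0
for year in range(1, 11):
    cpu *= cpu_growth_per_year
    mem *= mem_growth_per_year
    print(f"year {year:2d}: cpu x{cpu:7.1f}, memory x{mem:5.2f}, "
          f"gap x{cpu / mem:6.1f}")
```

After a decade at these rates the gap is roughly fifty-fold, which is why caches and similar latency-hiding techniques became so important.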


Program Compaction

Ben Fruchter

Abstract: Play a game, listen to music, take a picture, make a phone call, take over a small country: all but the last can now be easily accomplished with a little device that fits into your pocket. With the increase in demand for powerful mobile technology, handheld devices can now accomplish what, just a few short decades ago, many people thought even the most powerful computers would never be capable of. This push for greater capability in smaller, more efficient packages presents a greater challenge than merely acquiring new functionality. While the constraints that mobile technology places on embedded software are hardly new, the modern world of traditional computing rarely needs to be as concerned with such restrictions. Beyond full functionality and dependability, the three main considerations are space, energy consumption, and responsiveness. For mobile technology to mimic the functionality of less constrained systems, the code must be reduced to fit into more restricted memory, consume less power, and yet still react nearly as fast. This seminar will focus on code reduction methodologies, with particular attention paid to code compaction.
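
The abstract does not name a specific compaction method, but procedural abstraction, which factors a repeated instruction sequence out into a single shared routine, is one classic compaction technique. The sketch below is a hypothetical toy illustration (the instruction strings, the helper name abstract_repeats, and the routine label r0 are all invented for the example).

```python
# Minimal sketch of procedural abstraction on a toy "program":
# find a repeated instruction sequence and replace each occurrence
# with a call to a single shared routine.

program = ["load a", "add b", "store c",     # sequence appears here...
           "load x",
           "load a", "add b", "store c"]     # ...and again here

def abstract_repeats(prog, length=3):
    """Replace two identical windows of `length` instructions with calls."""
    for i in range(len(prog) - length + 1):
        window = prog[i:i + length]
        for j in range(i + length, len(prog) - length + 1):
            if prog[j:j + length] == window:
                compacted = (prog[:i] + ["call r0"] +
                             prog[i + length:j] + ["call r0"] +
                             prog[j + length:])
                return compacted, {"r0": window}
    return prog, {}

compacted, routines = abstract_repeats(program)
print(compacted)   # ['call r0', 'load x', 'call r0']
print(routines)    # {'r0': ['load a', 'add b', 'store c']}
```

A real compactor would, of course, only perform the transformation when the bytes saved outweigh the overhead of the call and return instructions.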


Constraint Programming

Kerem Kacel

Abstract: Constraint Programming is an alternative approach to programming that relies on reasoning about constraints as much as on computing. It is a two-phase process: representing the problem as a Constraint Satisfaction Problem (CSP), and then solving it. The concept of constraints can be integrated into general-purpose programming languages such as C++ and Java, or combined with Logic Programming techniques to form Constraint Logic Programming (CLP). This seminar will cover the history, principles, and applications of Constraint Programming, and will go into some detail about CLP. Applications include DNA structure analysis, timetabling, scheduling problems, AI, puzzles, and finance.
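
As a concrete illustration of the two-phase process described above, here is a minimal sketch (a hypothetical example, not taken from the seminar) that represents the coloring of three mutually adjacent regions as a CSP and solves it with plain backtracking search.

```python
# Phase 1: represent the problem as variables, domains, and constraints.
variables = ["A", "B", "C"]                           # three adjacent regions
domains = {v: ["red", "green", "blue"] for v in variables}
constraints = [("A", "B"), ("B", "C"), ("A", "C")]    # neighbors must differ

def consistent(var, value, assignment):
    """Check that giving `var` this value violates no constraint."""
    return all(assignment.get(other) != value
               for (x, y) in constraints
               for other in ((y,) if x == var else (x,) if y == var else ()))

# Phase 2: solve the CSP with simple backtracking search.
def backtrack(assignment):
    if len(assignment) == len(variables):
        return assignment
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        if consistent(var, value, assignment):
            result = backtrack({**assignment, var: value})
            if result:
                return result
    return None

print(backtrack({}))   # e.g. {'A': 'red', 'B': 'green', 'C': 'blue'}
```

Real constraint solvers add propagation (pruning domains as assignments are made) and clever variable ordering, but the separation between modeling and solving is the same.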


Network Protocol Simulation: A Look at Discrete Event Simulation

Grant Lanterman

Abstract: The rapid growth of the Internet has brought about a need for efficient and effective protocols to drive its communications and software. Assurance testing of these protocols can no longer be accomplished with the small test networks of the past. Many network simulators, such as OPNET, have arisen to address this issue. Most of them rely heavily on the techniques of discrete event simulation to model the dynamic nature of the Internet effectively. In this seminar, we will look at these techniques in some depth using the simple, classic bank queuing example.
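
To give a flavor of the technique, the sketch below is a minimal discrete event simulation of a single-teller bank queue (the arrival and service rates are hypothetical, and the seminar's own example may differ): events sit in a time-ordered queue, and the simulation clock jumps directly from one event to the next.

```python
# Minimal discrete event simulation of a single-teller bank queue.
import heapq, random

random.seed(1)
ARRIVAL_RATE, SERVICE_RATE, N_CUSTOMERS = 1.0, 1.2, 1000

events = []                                   # (time, kind) min-heap
t = 0.0
for _ in range(N_CUSTOMERS):                  # pre-schedule all arrivals
    t += random.expovariate(ARRIVAL_RATE)
    heapq.heappush(events, (t, "arrival"))

clock, busy_until, waits, waiting = 0.0, 0.0, [], []

while events:
    clock, kind = heapq.heappop(events)       # jump to the next event
    if kind == "arrival":
        if clock >= busy_until and not waiting:
            waits.append(0.0)                 # teller free: serve immediately
            busy_until = clock + random.expovariate(SERVICE_RATE)
            heapq.heappush(events, (busy_until, "departure"))
        else:
            waiting.append(clock)             # join the queue
    else:                                     # departure: serve next in line
        if waiting:
            arrived = waiting.pop(0)
            waits.append(clock - arrived)
            busy_until = clock + random.expovariate(SERVICE_RATE)
            heapq.heappush(events, (busy_until, "departure"))

print(f"average wait: {sum(waits) / len(waits):.2f} time units")
```

Network simulators apply the same pattern, with packet transmissions, timeouts, and link delays playing the role of arrivals and departures.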


Mass Storage and Information Retrieval

Paul Mazzotte

Abstract: All public and private companies, large and small, as well as government facilities that rely on computer systems for any part of their business, need data storage systems. Data storage systems are responsible for housing critical business information. Backup and restoration of this information are critical to the survival of these companies in the event of a storage system failure or a physical disaster such as a fire. Currently, enterprise storage consolidation is on the rise because it increases efficiency, decreases redundancy, and simplifies management. This seminar will discuss past, current, and future data storage and data backup paradigms, with particular focus on the latest paradigms that address storage consolidation: Network Attached Storage (NAS) and Storage Area Networks (SANs).


Agile Software Development: An Alternate Approach

Umar Munroe

Abstract: As the popularity of the Internet and portable electronic devices (e.g., PDAs and mobile phones) has increased in recent years, the demand for software in these volatile markets has also increased. Along with the increased demand has come the need for this software to be developed quickly and to adjust to changing requirements. Industry software experts, however, noticed that software development teams in most companies could not meet the challenge. Some attribute the problem to the traditional heavyweight (highly structured, document-driven) processes most companies have adopted as standard practice for software development. These heavyweight processes, many believe, are too restrictive for the dynamic nature of software development. The proposed alternative is Agile Software Development, which refers to lightweight (less structured) development approaches characterized by their ability to adapt to changing requirements throughout development while producing quality software quickly. These agile approaches place less emphasis on process and more on open, honest communication. Adaptability is achieved through an iterative approach to development with incremental delivery and demonstration to the customer. Customer feedback is elicited frequently, and functionality is adjusted to meet requirements throughout the entire development process. The result is an end product that always reflects the customer's current and most important project requirements.


The Digital Library

Yan Situ

Abstract: In the last decade, digital libraries have emerged as a key approach to managing and searching global information spread across distributed collections. They have evolved rapidly and are now as varied as physical libraries. Digital libraries stand at the intersection of many fields, including information retrieval, computer science, and networked systems, and different views and perspectives have led to differing definitions. In this seminar, we will discuss the history, theory, and components of digital libraries, and we will see how easy it is to build one.
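
As one small illustration of the information retrieval component mentioned above, the sketch below builds a minimal inverted index, the search structure that typically sits at the core of a digital library's collection search (a hypothetical example with made-up documents, not taken from the seminar).

```python
# Minimal inverted index: maps each term to the set of documents
# containing it, then answers simple AND queries by intersection.
from collections import defaultdict

documents = {
    1: "digital libraries manage distributed collections",
    2: "information retrieval searches document collections",
    3: "networked systems serve digital collections worldwide",
}

index = defaultdict(set)
for doc_id, text in documents.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def search(query):
    """Return ids of documents containing every query term."""
    terms = query.lower().split()
    results = [index.get(t, set()) for t in terms]
    return set.intersection(*results) if results else set()

print(search("digital collections"))   # {1, 3}
```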


Grid Computing: The Promise and Challenges of a New Model for Computation

Tom Smith

Abstract: The term "grid computing" has been around for several years now and has been gaining prominence in both the popular media and the scientific community. However, in spite of some notable successes in corporate, academic, and scientific arenas, there is still little agreement about what formally constitutes a "grid." This seminar will tease out the various flavors of grid computing and will look in depth at several emerging standards, platforms, and implementations. Future directions and challenges of grid computing will also be explored.


Program Refactoring

Mitch Soden

Abstract: "If it ain't broke, don't fix it." That is often the sentiment of software developers. After all, programmers get paid to write software - new software or enhancements/fixes to existing software, but not to change code that works perfectly well already. The discipline of refactoring does just that. It improves the design of perfectly functioning software in order to avoid unnecessary complexity and ease maintainability. This seminar will focus on the question "Is it worth the effort?" as well as address why refactoring is different than any of the other software engineering practices that developers know they should be doing, but never get around to.


Domain Engineering and Software Reuse

Jacob Szwarcberg

Abstract: Domain engineering has been researched and explored as a way to improve and enable software reuse in software systems. It was applied in several highly funded projects in the late 1980s and 1990s to prove its effectiveness. The objective of this seminar is to give the audience a fundamental understanding of the processes behind domain engineering and to explore its current real-world significance. The seminar will introduce domain engineering and its processes and will discuss case studies and current implementations.


Election Automation

Cheryl Yager

Abstract: The act of voting has taken place throughout history. As new developments and principles have arisen, new technology has emerged to aid American citizens in exercising this age-old yet fundamental right. Recently, however, many voters have lost faith in the election process, especially in the devices used to record and tally votes. The presidential election of 2000 showed America that old technology just "doesn't cut it," while newer technology carries the inherent problems of being the "new kid on the block" and can be just as untrustworthy. As a result, there is a political and technical debate among many of America's government leaders and computer scientists about what can be done to improve the mechanisms voters use to cast their ballots on Election Day. It is important to understand that under current law, each individual state is entitled to designate its own voting practices. This means that some states may venture into new technology and implement the latest voting devices, while other states may adopt the "if it ain't broke, don't fix it" attitude and retain their legacy voting equipment. There is, however, a nationwide call to arms to address the issues of security, usability, and vote verification that have surfaced with the development and deployment of Direct Recording Electronic (DRE) voting machines, as well as the devices used to tally ballots electronically, such as punched-card counters.
