Over the past few years, I have been lucky enough to work on a very broad range of technologies: embedded systems programming, robotics, multimedia programming, distributed programming, web application development, versioning systems, system administration, the LaTeX typesetting system, RDBMSs and embedded databases, GUI development, website design, and system application programming. Some of the projects I have developed are listed on this page. For other information you may download my resume or email me.

Some related academic achievements:
  • ICSE: computers as preferred course of study (top of the class, 95% in nation-wide examination).
  • CBSE: computers as preferred course of study (top of the class, 88% in nation-wide examination).
  • President: Computer Society, 8/2000-6/2002, Institute of Technology & Management, India.
  • Designed first version of college website (www.itmindia.edu).
  • Bachelor of Engineering in Computer Science: 1st Division, Institute of Technology & Management.
  • U.S. Mensa Member, 2/2003 to present.
  • GRE Scores: 2210 / 2400, Verbal: 630/800, Quantitative: 780/800, Analytical: 800/800.
  • TOEFL Scores: 280/300, Writing: 6.0/6.0.
  • Master of Science, Computer Science, Robotics, Texas Tech University.
  • Research Assistant, AI Robotics Lab, Texas Tech University.
  • AI Robotics: received an "A", aced the projects.
  • Scholarship for entire Master's Program, Fall 2002 to Spring 2005.
Courses taken in Master's Program:  
  • Advanced Operating Systems
  • Intelligent Systems
  • Distributed Computing
  • Reinforcement Learning
  • Object Oriented Software Development
  • Logic for Computer Scientists
  • Multimedia Systems
  • AI Robotics
  • Neural Networks
  • Master's Thesis
  • 1 year of Practical Training at Resolution Systems Inc.


I've been using Eclipse for all my development for a long time now. Until recently, I was also using SourceJammer as my versioning system of choice. So, I wrote a SourceJammer plugin for Eclipse: for lack of a better name, I call it KSJEclipse. Now I don't need to leave the Eclipse IDE to perform most versioning actions: I can check files out and in, and see the version status of every file in my projects. It makes my life really easy: one less program to keep open, and instantly visible file status speeds up development.

The binaries, documentation and source are available at SourceForge.
More information about the product can be found at its homepage here.
This is my first Eclipse plugin, so please handle it with care: try it out on some throwaway project before using it in production. For feature requests or comments, you can email me using the form here.

screenshots:  1  2  3  4  5  6  7  8  9


Editors and IDEs

I have been using the Eclipse IDE for all development for a long time now: Java, C++ and UIs alike. Well, what can I say, I find it much better than NetBeans. First of all, the IDE is very appealing; secondly, it's extremely powerful, and fast. So now, Linux or Windows, Eclipse it is :-)

On Linux, I like to use XEmacs and Vi. I used to carry out C/C++ development and LaTeX typesetting in XEmacs, and all Java development in NetBeans. On Windows, I also like to use EditPlus2, an amazing editor that is customizable for nearly every language. I use Windows for all my work in Adobe Photoshop, Macromedia Dreamweaver, Macromedia Flash, Visual Basic development, and some other Windows-only tools. Other useful tools are PuTTY for SSH and PSFTP for secure FTP.

The Beginning: GWBASIC

The first program I wrote was in GWBASIC. Yeah, my relationship with programming goes back a really long way: I began learning how to program when I was 12 years old, and had built a considerable reputation for myself by the time I was 15.

When I was 13, I enrolled in a summer workshop to learn dBase III+ (the hottest database at the time). I was the odd one out, as the rest of the class was composed of students 25 and older. But it was very interesting, and my first experience with databases was a good one.

My final project in Computers for the ICSE board examination was written entirely in GWBASIC and was 1.5 MB in size, which was unheard of for school projects in those days. I had written DOS Tutor, a program to teach DOS to the user. The project did some heavy-duty pixel and DOS window color manipulation to create highly animated graphics, used a primitive database (which I also wrote entirely in GWBASIC) for a glossary, and had several menus, lots of exercises for the user, and even stored the user's data on the file system (my first experience with sessions). That glossary, I feel, is one of the coolest pieces of code I have written: entirely browsable using the arrow keys. Select a letter and hit Enter, and the available words appear in a window on the right-hand side of the screen; tab over to that window, hit Enter on a word, and you can see the definition! This kind of interface is very common today, but was not at that time. I received full marks for the project and graduated top of the class, securing 95% in the nation-wide examination.


After GWBASIC, I worked briefly with QBasic, but didn't get enough time to learn it. Then I stepped into the world of C/C++, and it was a new world! Object-oriented programming, an awesome graphics library, control functions for the keyboard, so much more... I was in heaven! I had access to Borland's Turbo C++ at the time and wrote some amazing programs: applications with nifty graphics, database access, some simple games, system manipulation, file management and much more. I found C and C++ to be so powerful that I almost never felt the need to learn another OOP language. But then I discovered the sweet world of Java...

I had always refused to learn Java, because I considered the coding bloated and the programs slow. True, it was supposed to be platform-independent, but as a developer I did not think about how the programs I wrote would be used -- I was only concerned with enjoying the development experience. And I considered myself an expert in C/C++, so why learn another OOP language? Well, I had to learn it: I was assigned a project that was to be written in Java, with an interface built in Swing. So I worked on Java, and I liked it. In the real world, portability is a major issue, and Java solves it. Moreover, the sheer extent of the libraries available in the Java framework makes a lot of tasks very easy. Now, with skinnable look-and-feel decorations for Swing, with both Eclipse and NetBeans providing rich client platforms, and with Eclipse's SWT, Java applications can look good too.

But C/C++ and Java each have their own niche. While C code is definitely amazingly fast, Java code is portable. It's a cinch to write embedded software in C, not so in Java. But right now, I enjoy both equally :-)

I have always had an interest in AI, and so have given my share of attention to Prolog. But the language didn't really appeal to me at first, because I couldn't see how it would be applicable in the real world when it couldn't even provide a decent interface to the user. I did get to be pretty good at programming in it, but then I didn't develop in it for some time. However, I recently found GNU Prolog, a Prolog compiler that allows socket connections to Java and C/C++ programs. So I can actually take the AI power of Prolog and combine it with the power of object-oriented languages. Having studied Prolog twice (undergrad and Master's), I am rapidly picking it up again, and am enjoying the experience that this powerful combination provides.

Visual Basic

During my sophomore year (2000), I developed an interest in Visual Basic -- the ease of designing powerful applications with full-fledged interfaces appealed to me. I developed several applications, the most significant of which are listed below:

1. Sleep Program -- I wanted to be able to sleep to music. This program automatically shut down my computer after a specified interval of time.

2. Library Management System -- I submitted this application as my final project in junior year. The name says it all. It uses an Oracle database.

3. Buddy -- This is one of my masterpieces. I studied Microsoft's API for Speech and MS Agent. It displays an animated character on your screen, and responds to voice commands to control your computer. When it starts up, it scans your computer for all installed programs. It employs Windows API calls to control the computer so that you can open any program just by saying its name. You can minimize/maximize/close windows, shutdown/logoff your computer, control the screensaver, and so much more.

4. createPLS -- I run an MP3 server so that I can listen to my music no matter where I am. The server has a PHP/HTML web interface, so to play a song I just drag its link into Winamp on Windows or XMMS on Linux. Dragging every song one at a time was getting to be a real pain. To solve this problem, I developed this application, which scans a specified directory tree for MP3 files and generates a playlist file in every directory, named after the directory it is in. It can also automatically delete all PLS files in any tree. Now all I do is drag and drop a PLS file into Winamp/XMMS and automatically get all the songs in my playlist :-) The program can also be executed remotely.
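The directory walk at the heart of createPLS can be sketched in C++17 (the original was Visual Basic; the function names and URL prefix here are illustrative, not the original code):

```cpp
// Sketch of the createPLS idea: for each directory in a tree, collect
// its .mp3 files and write a playlist file named after the directory.
#include <cassert>
#include <filesystem>
#include <fstream>
#include <string>
#include <vector>
namespace fs = std::filesystem;

// Write one .pls file listing the MP3s found directly inside `dir`.
void writePlaylist(const fs::path& dir, const std::string& urlPrefix) {
    std::vector<fs::path> songs;
    for (const auto& entry : fs::directory_iterator(dir))
        if (entry.is_regular_file() && entry.path().extension() == ".mp3")
            songs.push_back(entry.path());
    if (songs.empty()) return;
    std::ofstream pls(dir / (dir.filename().string() + ".pls"));
    pls << "[playlist]\n";
    int n = 0;
    for (const auto& s : songs)
        pls << "File" << ++n << "=" << urlPrefix << s.filename().string() << "\n";
    pls << "NumberOfEntries=" << n << "\n";
}

// Walk the whole tree, generating one playlist per directory.
void createPLS(const fs::path& root, const std::string& urlPrefix) {
    writePlaylist(root, urlPrefix);
    for (const auto& entry : fs::recursive_directory_iterator(root))
        if (entry.is_directory())
            writePlaylist(entry.path(), urlPrefix);
}
```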

SleepProgram, LMS and Buddy were written for Windows 98 and have not been tested on other versions of Windows. createPLS was developed on Windows XP but should work on all versions of Windows, because it does not make any platform-specific API calls.

Research Assistant: AI Robotics Lab, Texas Tech University

In 2002, I enrolled at Texas Tech University for a Master's degree in Computer Science, majoring in AI Robotics. Dr. Larry Pyeatt agreed to be my advisor, and I got a small office in the AI Robotics Lab. As an RA in a robotics lab, I had to work on, you guessed it, robots!

There were several projects going on at the same time:
  • visual representation of the world
  • reinforcement learning & robotics
  • mapping using dynamically expanding occupancy grids
  • localization using dynamically expanding occupancy grids
  • database connectivity between mapping algorithms and Oracle/MySQL
  • a prototype for a light-weight, disposable planetary rover
  • ...lots more

Book list:
  • Introduction to AI Robotics, Dr. Robin Murphy
  • Behavior-Based Robotics, Dr. Ronald Arkin

The project development phases involved making good design, sound mechanical construction, programming, calibration, and report generation using LaTeX.

1. Dead Reckoning & Time Based Navigation (Adjudged top of the class)
Class project to design a robot that would move along a specific path and return as close as possible to its starting position in the shortest amount of time, using a Lego Mindstorms kit and the NotQuiteC (NQC) programming language. The dead-reckoning version counted the number of rotations of a wheel to measure how far the robot had moved, whether driving straight or turning. I learnt that no matter how good a programmer you may be, you have no control over the dirt on the floor in a robot's path, which will make it change course; nor over the wear and tear of the robot's joints and the rubber on its wheels. The time-based version measured the time taken for one unit rotation of the wheel, and used that measure to move the robot a given distance. A lot of mechanical and electrical engineering also went into this project to reduce the effects of the environment as much as possible. After near-perfect runs, with near-perfect 90-degree turns, the robot finished a mere 2 inches from its starting location using either of the two algorithms.
source pic1 pic2
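The dead-reckoning arithmetic amounts to converting wheel-rotation counts into distances and integrating a differential-drive pose. A minimal sketch (illustrative, not the original NQC code; the axle length and tick counts are made up):

```cpp
// Dead-reckoning sketch: distance from encoder ticks, plus a simple
// differential-drive pose update.
#include <cassert>
#include <cmath>

const double PI = 3.14159265358979323846;

struct Pose { double x, y, theta; };

// Distance covered by one wheel given encoder ticks.
double wheelDistance(int ticks, int ticksPerRev, double wheelDiameter) {
    return (double)ticks / ticksPerRev * PI * wheelDiameter;
}

// Update pose from left/right wheel distances (b = axle length).
Pose deadReckon(Pose p, double dLeft, double dRight, double b) {
    double d = 0.5 * (dLeft + dRight);     // distance of robot centre
    double dTheta = (dRight - dLeft) / b;  // change in heading
    p.theta += dTheta;
    p.x += d * std::cos(p.theta);
    p.y += d * std::sin(p.theta);
    return p;
}
```

Dirt and wheel wear show up as error in `dLeft`/`dRight`, which is exactly why the real robot drifted.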

2. Line Follower (Adjudged top of the class)
Class project to design the most accurate and fastest line-follower robot, again built with the Lego Mindstorms kit and the NotQuiteC programming language. The goal was to use light sensors to detect and follow a line made with black electrical tape on the floor. It was made really complicated because the line crisscrossed over itself, and the room was filled with fluorescent light, making it difficult for the light sensors to detect the line. My robot completed the run in 56 seconds (under one minute, every time it was run); the first runner-up took more than 4 minutes. Employing some smart engineering, after 3-4 prototypes I moved the RCX brick onto a trailer to reduce the weight on the driven front wheels. So the front moved and turned really fast, and the trailer with the heavy RCX brick simply followed it.
source pic1 pic1 pic3 pic4 pic5 pic6
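One common way to cope with difficult lighting is to calibrate the sensor on the tape and on the bare floor, then threshold halfway between the two readings. A toy sketch of that decision (made-up readings, not the original NQC code):

```cpp
// Calibrated bang-bang line detection: threshold sits midway between
// the calibrated "on tape" and "on floor" sensor readings.
#include <cassert>

int midpointThreshold(int darkCal, int lightCal) {
    return (darkCal + lightCal) / 2;  // halfway between tape and floor
}

// Low reading = dark tape under the sensor = on the line.
bool onLine(int reading, int threshold) {
    return reading < threshold;
}
```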

3. Mapping & Obstacle Avoidance Wandering (A Grade)
Class project to design a navigational and mapping obstacle-avoidance algorithm. I based my implementation on the HIMM algorithm proposed by Borenstein and Koren. The application runs on the Nomadic Technologies Super Scout II simulator; it is written entirely in C++ on a Linux machine and uses the Scout API extensively. The application had two goals: wander the environment without hitting any obstacles in the robot's path, and build a map of the environment as the robot sees it, using occupancy grids.
source pic
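The HIMM update rule itself is tiny: cells along the sonar beam are decremented, the cell at the hit is incremented by a larger step, and values are clamped to [0, 15]. A one-dimensional sketch along a single beam (the real grid is 2-D; this is an illustration, not the project code):

```cpp
// HIMM-style occupancy update along one sonar beam: free cells -1,
// the hit cell +3, all values clamped to the range [0, 15].
#include <algorithm>
#include <cassert>
#include <vector>

void himmUpdate(std::vector<int>& beamCells, int hitIndex) {
    for (int i = 0; i < hitIndex; ++i)                        // free space
        beamCells[i] = std::max(0, beamCells[i] - 1);
    beamCells[hitIndex] = std::min(15, beamCells[hitIndex] + 3);  // obstacle
}
```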

I also gave a few presentations during this period, two of which are:
1. Mapping and Localization
2. Monte Carlo Localization

A Water Management System was to be developed as a joint project between the Departments of Civil Engineering and Computer Science. As an RA, I inherited the remainder of this project. The control system was to take data from pH, pressure and ORP sensors and insert the readings into an Oracle database. The system was also to provide a portable application and a web-accessible interface to monitor and control the system. It was developed in C++, with the user interfaces written in Java/Swing; the software interfaces with the physical sensors using CORBA objects. I developed the Java/Swing interface, making good use of JFreeChart and JFreeReport. After a couple of prototypes I was able to really improve the interface, displaying a lot of data in minimal screen space. I also developed the PHP/Oracle web application, so that users can view the data from anywhere in the world with only an internet connection.

Advanced Operating Systems

I have studied this course twice, once as a sophomore and once during my Master's. During my Bachelor's the course was all about theory, but at the graduate level it was all about class presentations and projects: the theoretical concepts had to be actually implemented.

The course involved writing different OS modules. The operating system was to be based on the Linux OS and to be tested on Simics. The different modules implemented during the coursework were:

Kernel: The core of the OS was a simplified kernel capable of interpreting a few internal commands and loading externally compiled ELF binaries. A few basic system calls were incorporated into this kernel.

File System: The file system was built from scratch to implement the ext3 file system. It involved writing block device drivers to handle access to the file system. Inode/zone manipulation, as well as directory management, was incorporated into this module.

Memory Management: A basic memory management system was also developed, incorporating chunk allocation, malloc() and free() functions, and division of memory into slabs.
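As a flavour of the kind of allocator involved, here is a toy first-fit free-list malloc()/free() in C++ (a simplified sketch under its own assumptions, not the course code; no coalescing or slab logic):

```cpp
// Toy first-fit allocator: a static "heap" carved into blocks, each
// with a small header; free() just marks the header free again.
#include <cassert>
#include <cstddef>

struct Block { std::size_t size; bool free; Block* next; };

alignas(alignof(std::max_align_t)) static char heap[4096];
static Block* head = nullptr;

void initHeap() {
    head = reinterpret_cast<Block*>(heap);
    head->size = sizeof(heap) - sizeof(Block);
    head->free = true;
    head->next = nullptr;
}

// First-fit: find a free block big enough; split it if there is room.
void* myMalloc(std::size_t n) {
    for (Block* b = head; b; b = b->next) {
        if (b->free && b->size >= n) {
            if (b->size >= n + sizeof(Block) + 16) {  // worth splitting
                Block* rest = reinterpret_cast<Block*>(
                    reinterpret_cast<char*>(b + 1) + n);
                rest->size = b->size - n - sizeof(Block);
                rest->free = true;
                rest->next = b->next;
                b->size = n;
                b->next = rest;
            }
            b->free = false;
            return b + 1;  // usable memory starts after the header
        }
    }
    return nullptr;  // out of memory
}

void myFree(void* p) {
    if (p) reinterpret_cast<Block*>(p)[-1].free = true;
}
```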

Terminal Emulation: This module provided a character console for the OS to input commands, as well as an output console. It included a parser and a character device driver.

Distributed Computing

There were three class projects in this course at Texas Tech University. A very interesting subject, where I actually applied and implemented what I had learnt about distributed computing: threads, marshalling, RPC, UDP sockets and more were implemented in the class exercises and projects.

1. Client Server Communication -- being the first project, this was a relatively basic construction of servers and clients communicating via sockets. There are three exercises, with the level of difficulty increasing for each exercise.

2. RPC Communication -- developed a math client/server architecture that accepts messages containing an operator and its arguments, processes the instruction and returns the result message to the client. This project employs Sun RPC: arithmetic expressions typed at the client side generate RPCs to the corresponding procedures on the server.

3. Multiple Servers -- clients send a math expression message to a dispatcher server, which parses the message and sends the operands to the corresponding add, subtract, multiply or divide server. The expression is processed on the intended server and the result is sent back to the dispatcher and then on to the client. This project demonstrated communication between multiple servers and multiple clients through one central server.
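With the socket plumbing elided, the dispatcher's core job looks like this (a sketch, not the project code; plain functions stand in for the add/subtract/multiply/divide servers):

```cpp
// Dispatcher sketch: parse "lhs op rhs" and route the operands to the
// matching operation "server" (here just a function).
#include <cassert>
#include <sstream>
#include <stdexcept>
#include <string>

double opServer(char op, double a, double b) {
    switch (op) {
        case '+': return a + b;
        case '-': return a - b;
        case '*': return a * b;
        case '/': if (b == 0) throw std::runtime_error("divide by zero");
                  return a / b;
    }
    throw std::runtime_error("unknown operator");
}

// Parse the client's message and forward it to the right server.
double dispatch(const std::string& message) {
    std::istringstream in(message);
    double a, b;
    char op;
    in >> a >> op >> b;
    return opServer(op, a, b);
}
```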

Reinforcement Learning

Reinforcement Learning is that branch of Artificial Intelligence where the agent learns with experience, and with experience only. No plans are given, there are no explicitly defined constraints or facts. It is a "computational approach to learning from interaction". The key feature of reinforcement learning is that an active decision-making agent works towards achieving some reward, which will be available only upon reaching the goal. The agent is not told which actions it should take to reach the goal, but instead it discovers the best actions to take at different states by learning from its mistakes. The agent monitors its environment at all times, because actions taken by the agent may change the environment and hence affect the actions available to the agent from the environment. The agent learns by assigning values to states and actions associated with every state. When the agent reaches a state that it has already learnt about, it can exploit its knowledge of the state space to take the best action. At times the agent takes random actions: this is called exploration. While exploring the agent learns about those regions of the state space that it would otherwise ignore if it only followed the best actions. By keeping a good balance between exploitation and exploration, the agent is able to learn the optimal policy to reach the goal. In all reinforcement learning problems, the agent uses its experience to improve its performance over time.
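The exploitation/exploration loop described above can be sketched as a one-step tabular Sarsa update with an epsilon-greedy action choice (a minimal illustration, not the project code; the random draw is passed in as an argument to keep the sketch deterministic):

```cpp
// One-step tabular Sarsa: Q[s][a] += alpha * (r + gamma*Q[s'][a'] - Q[s][a])
#include <cassert>
#include <vector>

void sarsaUpdate(std::vector<std::vector<double>>& Q,
                 int s, int a, double r, int s2, int a2,
                 double alpha, double gamma) {
    Q[s][a] += alpha * (r + gamma * Q[s2][a2] - Q[s][a]);
}

// Epsilon-greedy: explore with probability eps, otherwise exploit the
// best known action. `coin` is a pre-drawn uniform [0,1) sample.
int epsilonGreedy(const std::vector<double>& qRow, double eps,
                  double coin, int randomAction) {
    if (coin < eps) return randomAction;  // explore
    int best = 0;
    for (int a = 1; a < (int)qRow.size(); ++a)
        if (qRow[a] > qRow[best]) best = a;
    return best;                          // exploit
}
```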

Reinforcement Learning: An Introduction by Richard S. Sutton and Andrew G. Barto

Final Project:
The goal was to present a performance comparison between two popular reinforcement learning algorithms: the Sarsa(λ) algorithm and Watkins's Q(λ) algorithm. The algorithms were implemented on two classic reinforcement learning problems: the pole balancing problem and the mountain car problem. It is necessary to know which algorithm to use for a specific reinforcement learning problem; the aim was to indicate which is the better algorithm, and under which parameters, for the two types of problem. The mountain car problem is an example of an episodic control task where things have to get worse (moving away from the goal) before they can get better; the pole balancing problem is an example of an episodic control task where the agent tries to stay at the goal at all times.
sourceMC sourcePB

Multimedia Programming

This was an amazing class!! The projects required implementing the theoretical concepts in real-world applications. Some of the concepts I learned in this course were data compression and coding, image and video indexing and retrieval, and multimedia information systems. A brief description of the projects is given below:

1. Voice Over IP -- Interface with the audio device driver and realize real-time communication by sending audio packets over the Internet. source

2. Lena -- images reconstructed directly from wavelet coefficients. Lena was transformed using the Daub4 filter with a two-level wavelet decomposition. images

3. Indexing of Scenes of a Compressed Video Sequence -- The objective of this project was to detect scene changes in a compressed movie and index them, with the output presented as a mosaic of images. The compressed movie sequence is an MPEG-1 file. The program scans the MPEG-1 video sequence and extracts frame data, identifying scene boundaries automatically without decompressing the video data. For each scene, the program selects a picture as its representative image for the index. The program then decompresses the chosen pictures and reduces their size to place them in the mosaic. The project returns output in the form of YUV files of the final mosaic image, which is viewed by converting the separate files to a PPM image. The programs were written in C++ and compiled with g++ 3.2.2 on a Red Hat 9 machine; the project was successfully ported to Solaris.
mpg ppm paper
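Stripped of the MPEG parsing, the scene-boundary decision reduces to thresholding a per-frame difference score. A minimal sketch (in the real project the score comes from compressed-domain frame data; the numbers here are illustrative):

```cpp
// A frame starts a new scene when its difference score from the
// previous frame jumps past a threshold.
#include <cassert>
#include <vector>

std::vector<int> sceneBoundaries(const std::vector<double>& frameDiff,
                                 double threshold) {
    std::vector<int> cuts;
    for (int i = 1; i < (int)frameDiff.size(); ++i)
        if (frameDiff[i] > threshold)
            cuts.push_back(i);  // scene change at frame i
    return cuts;
}
```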

Neural Networks

An Artificial Neural Network (ANN) is an information processing paradigm that is inspired by the way biological nervous systems, such as the brain, process information. The information processing system is composed of a large number of highly interconnected processing elements (neurons) working together to solve a specific problem. ANNs, like people, learn by example. An ANN is configured for a specific application, such as pattern recognition or data classification, through a learning process. Neural networks, with their remarkable ability to derive meaning from complicated or imprecise data, can be used to extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques. A trained neural network can be thought of as an "expert" in the category of information it has been given to analyze.

Final Project:
The goal was to develop a menu-driven program in C++ to test the various algorithms taught in this class. Models included in the project were:
1. Single Discrete Perceptron
2. Multiclass Discrete Perceptron Classifier
3. Single Continuous Perceptron
4. Multiclass Continuous Perceptron Classifier
5. 2-Layer Back-Propagation Network
6. Hopfield Auto-Associative Memory Storage
7. Bidirectional Associative Memory Storage
8. Kohonen Competitive Learning
To implement this project I wrote one .cpp file containing the menu, and one header file for each NN learning algorithm.



TinyOS

TinyOS is an open-source operating system designed for wireless embedded sensor networks. It features a component-based architecture which enables rapid innovation and implementation while minimizing code size as required by the severe memory constraints inherent in sensor networks. TinyOS's component library includes network protocols, distributed services, sensor drivers, and data acquisition tools - all of which can be used as-is or be further refined for a custom application. TinyOS's event-driven execution model enables fine-grained power management yet allows the scheduling flexibility made necessary by the unpredictable nature of wireless communication and physical world interfaces.

TinyOS has been ported to over a dozen platforms and numerous sensor boards. A wide community uses it in simulation to develop and test various algorithms and protocols. New releases see over 10,000 downloads. Over 500 research groups and companies are using TinyOS on the Berkeley/Crossbow Motes. Numerous groups are actively contributing code to the sourceforge site and working together to establish standard, interoperable network services built from a base of direct experience and honed through competitive analysis in an open environment.

-- from the TinyOS website

When I started working with TinyOS, there were about 40 people using it; now there are hundreds. TinyOS programs are written in nesC, a C-like language with its own compiler. TinyOS was initially developed by the UC Berkeley EECS Department as an event-based operating system for embedded networked sensors. TinyOS runs on motes, which are available from Crossbow Technology Inc. I wrote TinyOS programs for the MICA, MICA2 and MICA2DOT motes, and for the newer Telos motes.

TinyOS is one of the most interesting technologies I have had a chance to work with. It gives the power of a computer to a tiny mote: networking, sensing, a file system and more are available on these motes via TinyOS. Although I worked on several projects involving sensing, data collection, networking and serial communication, the two most significant projects I developed were:

1. TCP -- reliable communication between two motes. As on any packet-switched network, packets can get lost; this layer provides reliable communication between two motes, virtually eliminating the chance of data loss.

2. Routing -- The motes can be programmed to multihop packets. Say three motes are aligned like this: A <---> B <---> C, and I want to send a packet from A to C. I cannot send it directly, because A and C are not within each other's radio range. So A simply broadcasts the message; B receives it, detects that the message is not for itself, and rebroadcasts it. C receives all packets broadcast around it, detects that one of them is addressed to itself from A, and consumes it instead of rebroadcasting. The problem becomes much more complicated when many motes are involved and you want the data packets to reach the relevant mote in the shortest possible time, without losing data to collisions. I developed a routing protocol that runs on each mote and figures out the best route for each packet, achieving exactly that.
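The per-mote forwarding decision can be sketched like this (TinyOS/nesC and radio details elided; the packet fields and hop limit are illustrative, not the project's actual protocol):

```cpp
// What a mote does with a packet it hears: consume it if addressed to
// itself, otherwise rebroadcast until a hop limit stops endless flooding.
#include <cassert>
#include <string>

struct Packet { int src; int dest; int ttl; std::string payload; };

enum Action { DELIVER, REBROADCAST, DROP };

Action onReceive(int myId, Packet& p) {
    if (p.dest == myId) return DELIVER;  // it's for me: consume it
    if (--p.ttl <= 0)   return DROP;     // hop limit reached: stop flooding
    return REBROADCAST;                  // pass it along
}
```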

This was my first experience with embedded software development, and it felt good. The feeling you get when you see a piece of code you have written actually make something happen (in a physical sense) is different from a regular computer program. With embedded systems, YOU are the operating system, telling the tiny pieces of hardware what to do with what parts of which data! Now that's control!

MSP430

After my experience with TinyOS, I was really into hardware programming, so I was delighted to be assigned another embedded systems project: writing code directly for Texas Instruments' MSP430 microcontrollers. No computer in between, no OS, just plainly code this pin, code that pin, input these bits and output those bits, light this LED and blow that horn! It was amazing! I used TI's MSP430F449 board, which has 60 KB of program flash, 256 bytes of data flash, 2 KB of RAM, a JTAG connector, an LCD, one LED, 4 buttons, a buzzer and some more stuff. Now, the hard part was understanding the processor manual; that took some time. There are no guidelines, just instruction A does this, pin B is for that, apply a High here and a Low there -- the manual is not for beginners! But I got the hang of it, and the first productive program I wrote for the processor was the LCD library. The programming is C-style but uses native MSP430 mappings.

1. LCD Library
To make the LCD library I wrote functions of the sort initlcd(), clearlcd(), wait(msecs), writeletter(posn, char), writeword(string), writedigit(posn, char), writespecial(char), and a few more, which can be called from any program just by including the library. The goal behind the library was to let me easily debug other applications that actually make use of the processor: having an LCD lets me check the results and verify that correct inputs are being sent to the processor.

2. Clock Library
Developed an interrupt-driven clock whose speed I can control. This lets me use my own clock in applications where I need to send signals on a pin to control the operation of another processor. I made this library primarily for use in the I2C project.

3. Buttons Library
Developed a library for the buttons on the board. Like the other libraries, this one can be used by other applications, to send input or to raise a signal that starts, stops or pauses a particular application.

To compile and install the code I had written I had to install the mspgcc toolchain for TI's MSP430 MCUs. I also had to install the JTAG library so that I could communicate with the processor for application management and debugging.

I2C: communication between two bare-bones MSP430 processors

Well, I had two MSP430F449 boards, so the next question was: how to make them talk to each other comfortably so that they can have a lasting relationship? I didn't have pins to spare; they were all being used for other stuff (sensors, LEDs, etc.). So how do I make the two MCUs communicate reliably and efficiently using a minimal number of pins? Philips had the answer: the I2C bus.

In the early 1980s, Philips created the I2C bus, a control bus for communication between the various integrated circuits in a system. It is a two-wire bus (one line for data, one for the clock), and the protocol provides three data transfer rates: up to 100 kbps, 400 kbps and 3.4 Mbps. Two simple lines can connect all the ICs in a system, because any I2C device can be attached to a common I2C bus; a master device on the bus can then communicate with any slave device. The I2C bus protocol was ideal for this project. However, the MSP430s did NOT have an I2C controller. So now what? Here's what: I got down to writing a complete software implementation of the I2C bus. A longer description of the protocol is given here.

The I2C protocol defines a master-slave relationship between components: the master controls the clock and asks for data from the slave or sends data to the slave; the slave simply responds to the master's requests. Only the master can initiate communication, and there are usually many slaves on an I2C bus and only one master. But this project wanted more! Either of the two MSP430s should be able to become the master, taking control of the bus and making the other the slave. It took me two weeks and 3 prototypes, but I did it! The entire application is completely interrupt-driven. Initially both processors are in the slave state. Then one of them reaches a condition where it needs to send or receive data to/from the other processor. This processor becomes the master by dropping the data line while keeping the clock line high; the slave interprets this signal as the start condition of the I2C protocol. Then regular I2C transfers take place: the master sends some bits, raising and lowering the data and clock lines as required, and the slave responds, raising and lowering the data line to send or receive the data (according to the protocol). At the end of the transfers, the master raises the data line while the clock is high; the slave understands this as the stop condition. The master processor then reverts to the slave state. Once again both processors are slaves, and either of them can become the master! A finite-state machine in the program maintains the state changes of the bus and the processors. The program is written entirely in C for the MSP430 and can only be compiled with the mspgcc toolchain.
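The heart of the slave's state machine is recognizing the start and stop conditions from data-line (SDA) transitions while the clock (SCL) is high. A desktop illustration of that decision, not the MSP430 interrupt code:

```cpp
// I2C bus-event detection: START = SDA falls while SCL is high,
// STOP = SDA rises while SCL is high; SDA changes while SCL is low
// are ordinary data bits.
#include <cassert>

enum BusEvent { NONE, START, STOP };

// Called on every SDA edge; sdaFell is true for a falling edge.
BusEvent onSdaEdge(bool sclHigh, bool sdaFell) {
    if (!sclHigh) return NONE;    // data may only change while SCL is low
    return sdaFell ? START : STOP;
}
```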

It was amazing to see one processor display a message on its LCD and then, at the push of a button, the same message appear on the other processor's LCD, the two connected by only two wires. Since I wrote the clock library to be completely configurable, I can really slow down the I2C communication and achieve data transfers between processors even when the distance between them is large, and even when there are many devices on the bus. Since my application conforms to the exact I2C standard defined by Philips, it will work beautifully even if a hardware I2C device is connected to the bus.


Over the past few years, I have attained a certain degree of expertise writing papers in LaTeX; all papers I have written have been typeset with it. LaTeX is a document preparation system for high-quality typesetting. Since it is not a word processor, the writer does not have to worry about the layout of the document: the writer's job is to get the content right and to place it in the proper place in the document, and LaTeX takes care of the rest. Global customization allows the user to apply changes to the entire document just by changing a few lines. While it does have a bit of a learning curve, I have found LaTeX to be a much better program for document preparation (especially if the document has mathematical equations) than any word processor I have used.

Web Application Development

Apache Tomcat is a servlet container which implements the official Java Servlet and JSP technologies. A regular web server (like Apache) cannot process the Java programs which are used in developing an enterprise web application. These applications are basically servlets and must run inside a servlet container: Tomcat. Tomcat and Apache can be coupled together (using mod_jk) so that Apache forwards all requests for the web application to Tomcat. I have a lot of experience administering and running applications inside Tomcat. Tomcat provides excellent security, and a great deal of flexibility and control to help a developer maintain applications. However, I now use JBoss AS for hosting my web applications; JBoss AS has Tomcat built in.
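As a sketch, the Apache side of a mod_jk coupling needs only a worker definition and a mount point. The worker name, port, and application path below are placeholders, not taken from an actual deployment:

```
# workers.properties: define an AJP 1.3 worker pointing at Tomcat's AJP connector
worker.list=tomcat1
worker.tomcat1.type=ajp13
worker.tomcat1.host=localhost
worker.tomcat1.port=8009
```

In httpd.conf, `JkWorkersFile conf/workers.properties` and `JkMount /myapp/* tomcat1` complete the wiring: Apache keeps serving static content itself and hands anything under /myapp to Tomcat over AJP.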

JBoss Application Server
JBoss Application Server is the most widely used Java application server on the market. A J2EE-certified platform for developing and deploying enterprise Java applications, web applications, and portals, JBoss Application Server provides the full range of J2EE 1.4 features as well as extended enterprise services including clustering, caching, and persistence. After having worked on Tomcat for quite some time, I ported most of my applications to the JBoss Application Server. Since JBoss integrates Tomcat within itself, it's not difficult to port your Tomcat web applications to JBoss AS.

JBoss Portal
A product of JBoss Inc., it provides a sound platform for deploying portals. JBoss Portal hosts and serves the portal's web interface, allowing for several customizations. I have written several portlets conforming to the JSR 168 standard. The advanced portlets that I have written are each a web application by themselves, using the portal as their interface, so that many web applications can be visible and usable by the user at the same time. The concept of portals and portlets greatly enhances a web application: the user can customize the portal's look and feel, and what would otherwise be a huge web application can be broken down into smaller, loosely coupled portlets, allowing a user to access different areas of the entire application at the same time. You can download the most basic of portlets, a "Hello World" portlet here.
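For reference, the JSR 168 deployment descriptor for a "Hello World" portlet of this kind looks roughly like this; the class name and titles are placeholders, not from my actual portlet:

```xml
<!-- portlet.xml: registers one portlet with the container -->
<portlet-app xmlns="http://java.sun.com/xml/ns/portlet/portlet-app_1_0.xsd"
             version="1.0">
  <portlet>
    <portlet-name>HelloWorldPortlet</portlet-name>
    <portlet-class>com.example.HelloWorldPortlet</portlet-class>
    <supports>
      <mime-type>text/html</mime-type>
      <portlet-mode>VIEW</portlet-mode>
    </supports>
    <portlet-info>
      <title>Hello World</title>
    </portlet-info>
  </portlet>
</portlet-app>
```

The portlet class itself extends javax.portlet.GenericPortlet and writes its markup fragment in doView(); the portal composes those fragments into the full page.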

I have worked on every stage of the enterprise web application development process. From database administration to making JavaScript menus, I've designed and developed applications. Most of the web applications I have written employ the following technologies:

MySQL is a highly scalable, SQL-compliant, open-source relational database management system. MySQL is the most reliable, easy-to-administer, and powerful database I have ever used.

PostgreSQL is another highly scalable, SQL-compliant, open-source object-relational database management system. I wouldn't call myself a PostgreSQL expert (there is always more to learn), but I can certainly get things done on it. Postgres has formed the database back-end of several web applications that I have written.

C-JDBC is a free, open-source database cluster middleware that allows any Java application (standalone application, servlet or EJB container, etc.) to transparently access a cluster of databases through JDBC. The database is distributed and replicated among several nodes, and C-JDBC balances the queries among these nodes. C-JDBC handles node failures and provides support for checkpointing and hot recovery. As part of the administrative side of what I do, I have set up, configured, and maintained C-JDBC database clusters. C-JDBC provides a virtual database (a replica of the actual database) to the web application. So, instead of accessing the database directly, web applications perform all database operations on C-JDBC, which, to them, is a regular database. You can download a sample C-JDBC virtual database configuration file here. A Java program to connect to a C-JDBC data source can be downloaded here.

Hibernate is a powerful, ultra-high performance object/relational persistence and query service for Java. Hibernate lets you develop persistent classes following common Java idiom - including association, inheritance, polymorphism, composition, and the Java collections framework. Hibernate allows you to express queries in its own portable SQL extension (HQL), as well as in native SQL, or with Java-based Criteria and Example objects.

Now you can either write line after line of complex SQL statements in your Java code and access your database via JDBC, or you can map your object classes to database fields in Hibernate mapping XML files and let Hibernate do the job for you. That's the motivation behind any object-relational system. Hibernate lets you use your class structure to access the database: you define the methods that are available to the Data Access Objects to extract data from, and push data to, the database. I have used Hibernate extensively; all web applications I have written use Hibernate. The basic structure of the Hibernate part of a web application can be downloaded here. It shows the mapping of a sample "users" table, the DAO methods, and the hibernate.xml file.
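A mapping file for a "users" table of the kind mentioned above might look like the sketch below; the class, columns, and property names are illustrative, not taken from my actual download:

```xml
<!-- User.hbm.xml: maps the User class onto the "users" table -->
<hibernate-mapping>
  <class name="com.example.User" table="users">
    <!-- primary key, generated by the database -->
    <id name="id" column="user_id">
      <generator class="native"/>
    </id>
    <property name="username" column="username"/>
    <property name="email" column="email"/>
  </class>
</hibernate-mapping>
```

With this in place, the DAO works in terms of User objects and HQL (e.g. "from User where username = :name") instead of hand-written SQL and result-set plumbing.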

A product of the technical genius at Jakarta, Struts is a Java framework for building web applications based on the Model 2 approach. Model 2 is a variation of the Model-View-Controller paradigm, and Struts provides the Controller component in this architecture. In my web applications, Hibernate is the Model and FreeMarker templates provide the View. Struts is configured for a web application using a deployment descriptor, much like the standard web.xml. Every web application I have written uses Struts.
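The Controller wiring lives in struts-config.xml. A fragment of the kind I describe might look like this sketch (the action path, class, and view names are placeholders):

```xml
<!-- struts-config.xml fragment: one request path mapped to an Action,
     with named forwards pointing at FreeMarker views -->
<action-mappings>
  <action path="/login"
          type="com.example.LoginAction"
          name="loginForm"
          scope="request">
    <forward name="success" path="/welcome.ftl"/>
    <forward name="failure" path="/login.ftl"/>
  </action>
</action-mappings>
```

The Action calls into the Hibernate-backed Model, then returns one of the named forwards; Struts dispatches to the matching template, keeping Controller, Model, and View cleanly separated.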

FreeMarker is a "template engine": a generic tool to generate text output (anything from HTML to autogenerated source code) based on templates. It serves as a good replacement for JSPs, and by not allowing much logic to take place in a FreeMarker page, it somewhat forces the use of the MVC paradigm -- which is a good thing. This separation of design and programming logic is good for the programmer and ultimately leads to a clean and maintainable application. It also has some programming constructs to help in the display of data. I have been using FreeMarker for a long time now to generate the web pages for my web applications. I had been using JSPs, but since I found FreeMarker, every web application I have developed uses it for page generation. A sample ftl file can be downloaded here. As you can see, this FreeMarker template contains definitions for the Struts tag libraries, JavaScript, HTML code, and Struts bean tags, like a regular JSP would, but it does not contain programming logic -- it is only concerned with the display of the data.
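To give a flavor of the display-only constructs, here is a small template fragment of my own invention (the model names are placeholders):

```
<#-- FreeMarker fragment: iterate over a list the Controller put in the model.
     Only display logic lives here; no business logic. -->
<h2>Registered users</h2>
<ul>
<#list users as user>
  <li>${user.name} (${user.email})</li>
</#list>
</ul>
```

Everything the template can do is of this kind -- loops, conditionals, and interpolations over data handed to it -- which is exactly what keeps the View thin.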

HTML is the new common language of the world; the web wouldn't exist without it. I learnt HTML via self-study, in eight hours, in my freshman year. I hand-coded webpages for some time, then discovered Dreamweaver and moved on to learning other things, like Flash, Photoshop, JavaScript, and Java applets. CSS helps me maintain the look and feel of web pages, whereas DHTML/JavaScript adds flair to otherwise static HTML pages. Having attained a great deal of experience in web page design early on, I find it very easy to design interfaces, so in a web application, that is the phase I save for the very end.

Once you use log4j, you cannot go back. That's so true! System.out.println() for debugging just doesn't cut it anymore. I MUST use a good logging tool: log4j it is, always was, and will be in the future.
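A minimal log4j.properties is all it takes to move off System.out.println(); the appender name and pattern below are just one reasonable choice:

```
# log4j.properties sketch: everything at INFO and above goes to the console
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{ISO8601} %-5p %c - %m%n
```

In code, each class gets its own logger (Logger.getLogger(MyClass.class)), so verbosity can later be tuned per package in the properties file without touching any source.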

Other technologies that I have worked on are Maven, Pluto Portal, uPortal, Struts Portals Bridge, PHP and more...

Web Interface Design

Before I was introduced to enterprise web application development, I learnt how to make interfaces look good. I taught myself HTML, JavaScript, DHTML, CSS, Macromedia Flash, Macromedia Dreamweaver, Adobe Photoshop, PHP, and any graphics manipulation software that I could get my hands on. I dabbled with 3D animation software, ULead software, CorelDraw, and really any software I could find: demo CDs that came with magazines, demos from the net...

My interest in interfaces led me to take the initiative and develop the official website for the school where I was getting my Bachelor's degree in Computer Science. (It's in India; at that time, a lot of institutes did not have websites.) Highly praised for that job, I was even more motivated to improve my skills. I came to the US and got a job developing websites for the Department of Petroleum Engineering at Texas Tech. I have also designed the logo for the Rate-A-Raider website.

Version Control Systems

Concurrent Versioning System (CVS)
CVS is used by developers around the world to maintain their growing, multi-programmer applications. A versioning system can maintain various versions of an application, allowing users anywhere in the world to download the version they want. It allows programmers working on different areas of the same file to merge in their changes without much difficulty. CVS has been around for quite some time, as it was one of the first popular versioning systems. I used to manage most of my projects in CVS; however, I now use SVN.

Subversion (SVN)
Subversion was built to replace the age-old CVS: it keeps the familiar CVS model and adds a whole lot of new features. I have set up and currently maintain a Subversion server. Before that, I had a CVS server, but I stopped using it. I have ample experience with versioning systems.

System Administration


I have been using Windows since it was DOS! OK, that's not entirely true, but the first version of Windows was akin to a GUI pasted on top of DOS. I was quite an expert with DOS; with the number of viruses that were hitting systems in those days, you had to be an expert to survive! Then came Windows, and I have been working with it ever since. Real Windows administration is all about hacking your machine: get into the registry, find and change cryptic settings, locate tools on your Windows machine that Microsoft does not want you to know about unless you are an expert, and so on. I have done my share of the above, and am at ease with any Windows environment. Windows provides simplicity by obscurity: it is one of the simplest OSs to use, if you don't want to do much.

I was the Network Assistant at the Institute of Technology & Management from 5/2001 to 5/2002 managing networked computers in the Advanced Computer Lab.


I have been administering Linux systems for a very long time now. I cannot consider myself an expert in managing Linux systems because with Linux, you will never know everything -- there is something new added every day! But if there's something that needs to be done, I know I can probably take care of it.

Well, over time, I have set up and regularly maintain several servers:
Network File System (NFS), HTTP (Apache), FTP (vsftpd & pure-ftpd), Samba, Blogger (roller), Jabber (jabberd2), Oracle DB Server, MySql Server, PostgreSQL Server, MP3 Server (using some open-source php scripts and a lot of self-written php code), ...

Other technologies that I have worked with include:
Iptables, Package management with RPM, SELinux, advanced bash scripts, init.d/rc.d, NCurses, CDialog, VI, and a lot more that would just fill up this page.

I also run VMware on my Windows machine so that I have easy access to Linux even when a Linux box is not available.

Master's Thesis: Monte Carlo Localization for Robots using Dynamically Expanding Occupancy Grids

The past few years have seen tremendous growth in the research areas of mobile robotics. While growth has been fast and several problems have been elegantly solved, most mobile roboticists are faced with two primary challenges: how will the robot gather information about its environment, and how will it know where it is? These two problems are referred to as:
(i). Mapping and
(ii). Localization.

Mapping is the process whereby a robot extracts relevant information from its environment, allowing it to "remember" it. Localization is the problem of estimating a robot's pose relative to a map of its environment. Both of these problems are computationally intensive to solve; furthermore, limitations on a robot's on-board computational abilities and inaccuracies in sensor hardware and motor effectors make them even harder. Most mapping techniques are limited by memory, and hence a robot has a limit on the area that it can directly map. Also, if the mapped area is extended, most mapping implementations require that the mapping parameters be changed and the entire mapping algorithm be executed again. In recent times, however, a new mapping technique was explored: using Dynamically Expanding Occupancy Grids (Ellore 2002) and a Centralized Storage System (Barnes, Quasny, Garcia, and Pyeatt 2004). With this approach, the robot has virtually unlimited storage space, limited only by hard drive space, and a small initial map which grows as the robot explores its environment.

Localization had not yet been attempted using Dynamically Expanding Occupancy Grids and a Centralized Storage System. This research was geared towards implementing Monte-Carlo Localization methods (Fox, Burgard, Dellaert, and Thrun 1999; Dellaert, Fox, Burgard, and Thrun; Thrun, Fox, Burgard, and Dellaert 2001; Fox, Thrun, Burgard, and Dellaert 2001) for robots using Dynamically Expanding Occupancy Grids. In doing so, this research aimed to provide a complete mapping and localization implementation for robots using dynamically expanding occupancy grids and a centralized storage system.
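The core Monte-Carlo Localization loop -- predict, weight by sensor likelihood, resample -- can be sketched in miniature. This illustrative Java works in one dimension and has none of the occupancy-grid or centralized-storage machinery; all names and the toy sensor model are my own:

```java
import java.util.Random;

// Minimal 1-D Monte Carlo localization sketch: each particle is a
// candidate robot position; its weight is how well it explains the
// latest sensor reading.
class Particle {
    double x;       // hypothesized position
    double weight;  // likelihood given the last reading
    Particle(double x) { this.x = x; this.weight = 1.0; }
}

class Mcl {
    static final Random RNG = new Random(42);

    // Motion update: move every particle, adding noise for motor error.
    static void predict(Particle[] ps, double dx, double noise) {
        for (Particle p : ps) p.x += dx + RNG.nextGaussian() * noise;
    }

    // Sensor update: weight each particle with a Gaussian likelihood of
    // the observation (a stand-in for a real map-based sensor model),
    // then normalize so the weights sum to one.
    static void correct(Particle[] ps, double observed, double sigma) {
        double total = 0;
        for (Particle p : ps) {
            double err = observed - p.x;
            p.weight = Math.exp(-err * err / (2 * sigma * sigma));
            total += p.weight;
        }
        for (Particle p : ps) p.weight /= total;
    }

    // Resample: draw a new generation of particles, favoring high
    // weights, so the cloud concentrates around likely poses.
    static Particle[] resample(Particle[] ps) {
        Particle[] out = new Particle[ps.length];
        for (int i = 0; i < ps.length; i++) {
            double r = RNG.nextDouble(), acc = 0;
            for (Particle p : ps) {
                acc += p.weight;
                if (acc >= r) { out[i] = new Particle(p.x); break; }
            }
            if (out[i] == null) out[i] = new Particle(ps[ps.length - 1].x);
        }
        return out;
    }
}
```

In the thesis setting, the Gaussian stand-in is replaced by ray-casting the particle's pose against the dynamically expanding occupancy grid, but the predict/correct/resample skeleton is the same.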

thesis defense

Mailing Lists:

C-JDBC Users
Freemarker Users freemarker-user@lists.sourceforge.net
Jetspeed Portals Users jetspeed-user@portals.apache.org
Log4j Users log4j-user@logging.apache.org
MspGcc Users mspgcc-users@lists.sourceforge.net
Struts Users user@struts.apache.org
Subversion Users users@subversion.tigris.org
TinyOS Help tinyos-help@Millennium.Berkeley.EDU
TinyOS Users tinyos-users@Millennium.Berkeley.EDU
Tomcat Users tomcat-user@jakarta.apache.org
Portlets Group portlets@yahoogroups.com