A Short Essay on the Java Virtual Machine 2013-09-11 23:06:35 +0100

To run a Java Program, you need a Java Virtual Machine. What is a Virtual Machine? Why is it necessary to have a Java Virtual Machine? Why is it useful that Java works this way?

Java, originally developed by Sun Microsystems (now owned by Oracle), is one of the most popular programming languages today. The project, which started back in 1991, grew out of the belief that “the next wave in computing was the union of digital consumer devices and computers” (Oracle, 2010). Back then, most devices were isolated and therefore largely incompatible with one another; the idea was to unite devices through the use of virtual machines, making them able to interact with each other through a common system.

A standard definition of a Virtual Machine is ‘the apparent machine that the operating system presents to the user, achieved by hiding the complexities of the hardware behind layers of operating software’ (Bond & Langfield, 2009, p. 140). However, this definition more commonly describes the role of an operating system such as Microsoft Windows or Apple’s Mac OS. The Java Virtual Machine (JVM) follows the same model, but adds another layer of software between the hardware and the user. From my understanding, a Virtual Machine in the JVM context can be expressed as a machine within a machine, where an operable machine (the JVM) uses resources from a host operating system (for example, Microsoft Windows) to execute a Java program. A more appropriate definition for a Java Virtual Machine would therefore be ‘the apparent machine that the virtual machine presents to the user, achieved by hiding the complexities of the host operating system behind another layer of operating software’.
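This layering can be observed from inside a running program. The following is a minimal sketch (the class name `WhereAmI` is my own, for illustration): the program asks the JVM, rather than the hardware, which host it is sitting on, so the same compiled bytecode runs unchanged on any platform.

```java
// Minimal sketch: the JVM hides the host operating system behind
// standard system properties, so identical bytecode runs unmodified
// on Windows, Linux or Mac OS.
public class WhereAmI {
    public static void main(String[] args) {
        // Both values are supplied by the JVM, not read from the hardware directly.
        String os = System.getProperty("os.name");      // e.g. "Linux" or "Windows 10"
        String vm = System.getProperty("java.vm.name"); // e.g. "OpenJDK 64-Bit Server VM"
        System.out.println("Running on " + vm + " atop " + os);
    }
}
```

Compiled once with `javac WhereAmI.java`, the resulting `WhereAmI.class` file can be copied to any machine with a JVM and run there without recompilation.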

As defined above, one of the roles of an operating system is to hide the complexities of hardware from the user. Developers use an Application Programming Interface (API) when creating programs. An API is a layer of software that allows application programs to call on the services of the operating system (Bond & Langfield, 2009, p. 140). My interpretation of this is that if a developer wanted to create a file on the hard disk, then instead of manually writing the code to create a file from scratch, they could use the Microsoft Windows API (assuming they were creating the program for Microsoft Windows), which would create the file using conventional methods. Today there are many operating systems available to consumers; however, each possesses different APIs and manages system resources differently. For example, the three most common operating systems (Microsoft Windows, Apple Macintosh and Linux) all use different file systems for storing data on a hard drive: Windows favours NTFS, Linux has ext4 and Macintosh uses HFS Plus. All of these file systems represent different methods of data storage on a hard disk. This creates a large problem for developers, as they would have to redevelop their program for each individual operating system, which would be very time-consuming and would add to a business’s costs. Java overcomes this obstacle with the JVM, allowing developers to write a program once and run it on many other operating systems, as long as those systems have a Java Virtual Machine. This is one of Java’s major selling points: the ‘write once, run anywhere’ feature.
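The file-creation example above can be sketched in Java itself. The class name `CreateFileDemo` and the file name `demo.txt` are illustrative; the point is that one standard-library call replaces three platform-specific ones, with the JVM translating it into the appropriate NTFS, ext4 or HFS Plus operation on the host.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class CreateFileDemo {
    public static void main(String[] args) throws IOException {
        // One portable call: the JVM maps it onto the host OS's own
        // file-system API (NTFS on Windows, ext4 on Linux, HFS Plus on Mac OS).
        Path file = Paths.get("demo.txt"); // illustrative file name
        Files.write(file, "hello".getBytes());
        System.out.println("Created: " + file.toAbsolutePath());
        Files.delete(file); // clean up after the demonstration
    }
}
```

The same source compiles once and behaves identically on each platform; only the JVM underneath changes.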

Matt Curtin’s article on this main feature of Java gives a detailed insight into what Java was anticipated to hold for the future while the language was still in its early stages. This source could be considered opinionated: although it was written by a professor at the Ohio State University (Department of Computer Science and Engineering at Ohio State University, 2012), it is found in a ‘rants’ folder on the website of the company he founded. However, he does explain why ‘write once, run anywhere’ is good news for both IT managers and end users, and argues that people should be less content with bugs in software; after all, consumers are paying for it, and you would not find it acceptable for your car to break down and need ‘restarting’ every few minutes (Curtin, 1998, p. 2). However, it can be argued that there is no such thing as a perfect program, because bugs will always exist within software. This can be because of limitations of the programming language, or because physical hardware cannot be fully depended upon, electrical connections being imperfect by nature.

Every day, ubiquitous computing becomes a more real part of our environment. For this, devices have to be compatible with each other. With Java, devices of a different nature can interact with each other easily, because their programs run on homogeneous virtual machines on both devices. Java now appears on over 4 billion devices, and some devices (for example, Blu-ray players) are exclusively powered by Java (Oracle, 2011). It is quite clear from this that Java has been successfully endorsed by many large corporations, which is helping Java to become more and more successful. “Java is the result of lots of good ideas from different programming languages coming together, in such a way that programmers can do what they need to do without need to jump through hoops and circles.” (Curtin, 1998, p. 4). From this, we could conclude that there has been successful innovation from the birth of Java, which, after all, benefits the future of computing.

Bibliography