The legacy developer's guide to Java 9

Wayne Citrin TechBeacon Wednesday, 11 January 2017

Every few years, when a new version of Java is released, the speakers at JavaOne tout the new language constructs and APIs, and laud the benefits. Meanwhile, excited developers line up, eager to use the new features. It’s a rosy picture—except for the fact that most developers are charged with maintaining and enhancing existing applications, not creating new ones from scratch.

Most applications, particularly commercially sold ones, need to be backward-compatible with earlier versions of Java, which won’t support those new, whiz-bang features. And, finally, most customers and end users, particularly those in enterprises, are cautious about adopting the newly announced Java platform, preferring to wait until they’re confident that the new platform is solid.

This leads to problems when developers want to use a new feature. Do you like the idea of using default interface methods in your code? You’re out of luck if your application needs to run on Java 7 or earlier. Want to use the java.util.concurrent.ThreadLocalRandom class to generate pseudo-random numbers in a multi-threaded application? That's a no-go if your application needs to run on Java 6 or earlier.

When new releases come out, legacy developers feel like kids with their noses pressed up against the window of the candy store: They’re not allowed in, and that can be disappointing and frustrating.

So is there anything in the upcoming Java 9 release that’s aimed at developers working on legacy Java applications? Is there anything that makes your life easier, while at the same time allowing you to use the exciting new features that are coming out next year? Fortunately, the answer is yes.

What legacy programmers could do before Java 9

You can shoehorn new platform features into legacy applications that need to be backward-compatible. Specifically, there are ways for you to take advantage of new APIs. It can get a little ugly, however.

You can use late binding to attempt to access a new API when your application also needs to run on older versions of Java that don’t support that API. For example, let’s say that you want to use the java.util.stream.LongStream class introduced in Java 8, and you want to use LongStream’s anyMatch(LongPredicate) method, but the application has to run on Java 7. You could create a helper class as follows:

import java.lang.reflect.Method;

public class LongStreamHelper {
     private static Class<?> longStreamClass;
     private static Class<?> longPredicateClass;
     private static Method anyMatchMethod;

     static {
          try {
               longStreamClass = Class.forName("java.util.stream.LongStream");
               longPredicateClass = Class.forName("java.util.function.LongPredicate");
               anyMatchMethod = longStreamClass.getMethod("anyMatch", longPredicateClass);
          } catch (ClassNotFoundException e) {
               longStreamClass = null;
               longPredicateClass = null;
               anyMatchMethod = null;
          } catch (NoSuchMethodException e) {
               longStreamClass = null;
               longPredicateClass = null;
               anyMatchMethod = null;
          }
     }

     // NotImplementedException is a custom exception, not part of the JDK
     public static boolean anyMatch(Object theLongStream, Object thePredicate)
          throws NotImplementedException {
          if (longStreamClass == null) throw new NotImplementedException();

          try {
               Boolean result
                    = (Boolean) anyMatchMethod.invoke(theLongStream, thePredicate);
               return result.booleanValue();
          } catch (Throwable e) { // lots of potential exceptions to handle. Let’s simplify.
               throw new NotImplementedException();
          }
     }
}
There are ways to make this a little simpler, or more general, or more efficient, but you get the idea.

Instead of calling theLongStream.anyMatch(thePredicate), as you would in Java 8, you can call LongStreamHelper.anyMatch(theLongStream, thePredicate) in any version of Java. If you’re running on Java 8, it’ll work, but if you’re running on Java 7, it’ll throw a NotImplementedException.
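On Java 8 or later, the reflective lookup succeeds and the invocation goes through. Here is a minimal, self-contained sketch of the same pattern (the JDK classes and methods are real; the demo class and its helper method are illustrative):

```java
import java.lang.reflect.Method;
import java.util.function.LongPredicate;
import java.util.stream.LongStream;

public class AnyMatchDemo {
    // Reflectively invoke LongStream.anyMatch(LongPredicate),
    // just as the helper class in the article would.
    static boolean reflectiveAnyMatch(long[] values, final long threshold) {
        try {
            Class<?> streamClass = Class.forName("java.util.stream.LongStream");
            Class<?> predicateClass = Class.forName("java.util.function.LongPredicate");
            Method anyMatch = streamClass.getMethod("anyMatch", predicateClass);

            LongStream stream = LongStream.of(values);
            LongPredicate predicate = x -> x > threshold;
            return (Boolean) anyMatch.invoke(stream, predicate);
        } catch (Exception e) {
            // On a JVM without these APIs, the lookup above would fail instead
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(reflectiveAnyMatch(new long[] {1, 2, 3}, 2)); // true: 3 > 2
    }
}
```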

Why is this ugly? Well, it can get extremely complicated and tedious when there are lots of APIs you want to access. (In fact, it’s tedious already, with a single API.) It’s also not type safe, since you can’t actually mention LongStream or LongPredicate in your code. Finally, it’s much less efficient, because of the overhead of the reflection, and the extra try-catch blocks. So, while you can do this, it’s not much fun, and it’s error-prone if you're not careful.

While you can access new APIs and still have your code remain backward-compatible, you can’t do this for new language constructs. For example, let’s say that you want to use lambdas in code that also needs to run on Java 7. You’re out of luck: the Java compiler will not let you specify a source compliance level later than its target compliance level. So, if you set a source compliance level of 1.8 (i.e., Java 8) and a target compliance level of 1.7 (Java 7), the compiler will not let you proceed.
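To stay compatible with a Java 7 target, you have to express the same logic with the older construct, typically an anonymous inner class. A sketch of the two equivalent forms (LongPredicate7 is a hypothetical interface standing in for java.util.function.LongPredicate, which is itself a Java 8 API and so unavailable on a Java 7 target):

```java
public class LambdaCompat {
    // Hypothetical single-method interface; java.util.function.LongPredicate
    // can't be used in code that must also compile against Java 7.
    interface LongPredicate7 {
        boolean test(long x);
    }

    // Requires -source 1.8 or later: a lambda expression
    static LongPredicate7 modern() {
        return x -> x > 10;
    }

    // Compiles under -source 1.7: an anonymous inner class with identical behavior
    static LongPredicate7 legacy() {
        return new LongPredicate7() {
            @Override
            public boolean test(long x) {
                return x > 10;
            }
        };
    }

    public static void main(String[] args) {
        System.out.println(modern().test(42) && legacy().test(42)); // true
    }
}
```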

Multi-release JAR files to the rescue

Until recently, there hasn’t been a good way to use the latest Java features while still allowing the application to run on earlier versions of Java that didn’t support those features. Java 9 provides a way to do this for both new APIs and for new Java language constructs: It's called multi-release JAR files.

Multi-release JAR files look just like old-fashioned JAR files, with one crucial addition: There’s a new nook in the JAR file where you can put classes that use the latest Java 9 features. If you’re running Java 9, the JVM recognizes this nook, uses the classes in that nook, and ignores any classes of the same name in the regular part of the JAR file.

If you’re running Java 8 or earlier, however, the JVM doesn’t know about this special nook. It will ignore it, and only run the classes in the regular part of the JAR file. When Java 10 comes out, it will offer another nook specifically for classes using new Java 10 features, and so forth.

JEP 238, the Java enhancement proposal that specifies multi-release JAR files, gives a simple example. Consider a JAR file containing four classes that will work in Java 8 or earlier:

JAR root

      - A.class
      - B.class
      - C.class
      - D.class

Let’s say that Java 9 comes out, and you rewrite classes A and B to use some new Java 9-specific features. Later, Java 10 comes out and you rewrite class A again to use Java 10’s new features. At the same time, the application should still work with Java 8. The new multi-release JAR file looks like this:

JAR root

      - A.class
      - B.class
      - C.class
      - D.class
      - META-INF
           - versions
                - 9
                     - A.class
                     - B.class
                - 10
                     - A.class

In addition to the new structure, the JAR file’s manifest contains an indication that this is a multi-release JAR.
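Per JEP 238, that indication is a single attribute in the JAR’s META-INF/MANIFEST.MF:

```
Multi-Release: true
```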

When you run this JAR file on a Java 8 JVM, it ignores the META-INF/versions section, since it doesn’t know anything about it and isn’t looking for it. Only the original classes A, B, C, and D are used.

When you run it using Java 9, the classes under META-INF/versions/9 are used in place of the original classes A and B, but the classes in META-INF/versions/10 are ignored.

When you run it using Java 10, both META-INF/versions branches are used: specifically, the Java 10 version of A, the Java 9 version of B, and the default versions of C and D.

So, if you want to use the new Java 9 ProcessBuilder API in your application while still allowing your application to run under Java 8, just put the new versions of your classes that use ProcessBuilder in the META-INF/versions/9 section of the JAR file, while leaving the old versions of the classes that don’t use ProcessBuilder in the default section of the JAR file. It’s a straightforward way to use the new features of Java 9 while maintaining backward compatibility.
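For instance, Java 9’s updated process API adds the ProcessHandle class alongside the ProcessBuilder improvements. A sketch of the Java 9 side of such a split (the PidHelper class is hypothetical; it compiles only against Java 9 or later, which is exactly why it belongs in the versioned section):

```java
public class PidHelper {
    // Java 9+ implementation: in a multi-release JAR this class would sit
    // under META-INF/versions/9, with a Java 8 fallback at the JAR root.
    static long currentPid() {
        return ProcessHandle.current().pid(); // ProcessHandle is new in Java 9
    }

    public static void main(String[] args) {
        System.out.println(currentPid() > 0); // true: a PID is always positive
    }
}
```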

The Java 9 JDK contains a version of the jar tool that supports creating multi-release JAR files. Other non-JDK tools also provide support.

Java 9: Modules everywhere

The Java 9 module system (also known as Project Jigsaw) is undoubtedly the biggest change in Java 9. One goal of modularization is to strengthen Java’s encapsulation mechanism so that the developer can specify which APIs are exposed to other components, and can count on the JVM to enforce the encapsulation. Modularization’s encapsulation is stronger than that provided by the public/protected/private access modifiers of classes and class members.

The second goal of modularization is to specify which modules are required by which other modules, and to ensure that all necessary modules are present before the application executes. In this sense, modules are stronger than the traditional classpath mechanism, since classpaths are not checked ahead of time, and errors due to missing classes only occur when the classes are actually needed. That means that an incorrect classpath might be discovered only after an application has been run for a long time, or after it has run many times.

The entire module system is large and complex, and a complete discussion is beyond the scope of this article. (Here's a good, in-depth explanation.) Rather, I'll concentrate on aspects of modularization that support legacy application developers.

Modularization is a good thing, and developers should try to modularize their new code wherever possible, even if the rest of the legacy application is not (yet) modularized. Fortunately, the modularization specification makes this easy.

First, a JAR file becomes modularized (and becomes a module) when it contains a file named module-info.class (compiled from module-info.java) at the JAR file root. module-info.class contains metadata specifying the name of the module, which packages are exported (i.e., made visible to the outside), which modules the current module requires, and some other information.
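A minimal module-info.java might look like this (the module and package names here are hypothetical):

```java
module com.example.legacy {
    exports com.example.legacy.api;  // packages made visible to other modules
    requires java.sql;               // modules this module depends on
}
```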

The information in module-info.class is only visible when the JVM is looking for it, which means that modularized JAR files are treated like ordinary JAR files when running on older versions of Java (assuming the code has been compiled to target an earlier version of Java; strictly speaking, you’d need to cheat a little and still compile module-info.class to target Java 9, but that’s doable).

That means that you should still be able to run your modularized JAR files on Java 8 and earlier, assuming that they’re otherwise compatible with that earlier version of Java. Also note that module-info.class files can be placed, with restrictions, in the versioned areas of multi-release JAR files.

In Java 9, there is both a classpath and a module path. The classpath works like it always has. If a modularized JAR file is placed in the classpath, it’s treated just like any other JAR file. This means that if you’ve modularized a JAR file, but are not ready to have your application treat it as a module, you can put it in the classpath, and it will work as it always has. Your legacy code should be able to handle it just fine.

Also, note that the collection of all JAR files in the classpath is considered to be part of a single unnamed module. The unnamed module is considered a regular module, but it exports everything to other modules, and it can access all other modules. This means that, if you have a Java application that’s modularized, but have some old libraries that haven’t been modularized yet (and perhaps never will be), you can just put those libraries in the classpath and everything will just work.

Java 9 contains a module path that works alongside the classpath. Using the modules in the module path, the JVM can check, both at compile time and at run time, that all necessary modules are present, and can report an error if any are missing. All JAR files in the classpath, as members of the unnamed module, are accessible to the modules in the module path and vice versa.

It’s easy to migrate a JAR file from the classpath to the module path, to get the advantages of modularization. First, you can add a module-info.class file to the JAR file, then move the modularized JAR file to the module path. The newly minted module can still access all the classpath JAR files that have been left behind, because they’re part of the unnamed module, and everything is accessible.

It’s also possible that you might not want to modularize a JAR file, or that the JAR file belongs to someone else, so you can’t modularize it yourself. In that case, you can still put the JAR file into the module path; it becomes an automatic module.

An automatic module is considered a module even though it doesn’t have a module-info.class file. The module’s name is the same as the name of the JAR file containing it, and it can be explicitly required by other modules. It automatically exports all of its publicly accessible APIs, and reads (that is, requires) every other named module, as well as the unnamed module.

This means that it’s possible to make an unmodularized classpath JAR file into a module with no work at all: Legacy JAR files become modules automatically, albeit without some of the information needed to determine whether all required modules are really there, or to determine what is missing.

Not every unmodularized JAR file can be moved to the module path and made an automatic module. There is a rule that a package can only be part of one named module. So if a package is in more than one JAR file, only one of the JAR files containing that package can be made into an automatic module; the others can be left in the classpath and remain part of the unnamed module.

The mechanism I've described sounds complicated, but it’s really quite simple. All it really means is that you can leave your old JAR files in the classpath or you can move them to the module path. You can modularize them or you can leave them unmodularized. And once your old JAR files are modularized, you can leave them in the classpath or put them in the module path.

In most cases, everything should just work as before. Your legacy JAR files should be at home in the new module system. The more you modularize, the more dependency information can be checked, and missing modules and APIs will be detected far earlier in the development cycle, possibly saving you a lot of work.

DIY Java 9: The modular JDK and jlink

One problem with legacy Java applications is that the end user might not be using the right Java environment. One way to guarantee that the Java application will run is to supply the Java environment with the application. Java allows the creation of a private, or redistributable, JRE, which may be distributed with the application. The JDK/JRE installation comes with instructions on how to create a private JRE. Typically, you take the JRE file hierarchy that’s installed with the JDK, keep the required files, and retain only those optional files whose functionality your application will need.

The process is a bit of a hassle: You need to maintain the installation file hierarchy, you must be careful that you don’t leave out any files and directories that you might need, and, while it does no harm to do so, you don’t want to leave in anything that you don’t need, since it will take up unnecessary space. That's an easy mistake to make.

So why not let the JDK do the job for you?

With Java 9, it’s now possible to create a self-contained environment with your application, and anything it needs to run. There's no need to worry that the wrong Java environment is on the user’s machine, and no need to worry that you’ve created the private JRE incorrectly.

The key to creating these self-contained run-time images is the module system. Not only can you modularize your own code, but the Java 9 JDK is itself now modularized. The Java class library is now a collection of modules, as are the tools of the JDK itself. The module system requires that you specify the class library modules that your code requires, and that in turn determines the parts of the JDK that you need.

To put it all together, you'll use a new Java 9 tool called jlink. When you run jlink, you’ll get a file hierarchy with exactly what you need to run your application — no more and no less. It will be much smaller than the standard JRE, and it’s platform-specific (that is, specific to an operating system and machine), so if you want to create these runtime images for different platforms, you’ll need to run jlink in the context of installations on each platform for which you want an image.
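A typical invocation looks something like this (the module name, main class, and paths are hypothetical; --module-path, --add-modules, --launcher, and --output are real jlink options):

```
jlink --module-path $JAVA_HOME/jmods:mods \
      --add-modules com.example.app \
      --launcher app=com.example.app/com.example.Main \
      --output app-image
```

The resulting app-image directory contains a trimmed runtime plus a launcher script, and can be zipped up and shipped as-is.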

Also note that if you run jlink on an application in which nothing has been modularized, there won’t be enough information to narrow down the JRE, so jlink will have no choice but to package the whole JRE. Even then, you’ll get the convenience of having jlink package the JRE itself, so you don’t need to worry about correctly copying the required file hierarchy.

With jlink, it becomes easy to package up your application, and everything it needs to run, without worrying about getting it wrong, and only packaging that part of the runtime that’s necessary to run your application. This way, your legacy Java application has an environment on which it’s guaranteed to run.

When old meets new

One problem with having to maintain a legacy Java application is that you’re shut out of all the fun when a new version of Java comes along. Java 9, like its predecessors, has a bunch of great new APIs and language features, but developers, remembering past experiences, might assume that there’s no way to use those new features without breaking compatibility with earlier versions of Java.

Java 9’s designers, to their credit, seem to have been aware of this, and they’ve worked hard to make those new features accessible to developers who have to worry about supporting older versions of Java.

Multi-release JAR files allow developers to work with new Java 9 features, and segregate them in a part of the JAR file where earlier Java versions won’t see them. This makes it easy for developers to write code for Java 9, leave the old code for Java 8 and earlier, and allow the runtime to choose the classes it can run.

Java modules let developers get better dependency checking by writing any new JAR files in a modular style, all the while leaving old code unmodularized. The system is remarkably tolerant, is designed for gradual migration, and will almost always work with legacy code that knows nothing about the module system.

The modular JDK and jlink let users easily create self-contained runtime images so that an application is guaranteed to come with the Java runtime that it needs to run, and everything that it needs is guaranteed to be there. Previously, this was an error-prone process, but in Java 9 the tools are there to make it just work.

Unlike earlier Java releases, the new features of Java 9 are ready for you to use, even if you have an older Java application and need to ensure that customers can run your application—regardless of whether or not they’re as eager as you are to move up to the newest Java version.