jasode 16 hours ago

The word "hype" is being used in 2 different ways.

Definition #1 is about Java's features: the original "Java is criminally underhyped" essay by Jackson Roberts means "not over-hyped" in terms of Java's technical capabilities, such as its type system, package manager, etc. E.g. Java has static types which Javascript/Python do not, and typing is a positive thing that helps prevent errors -- therefore, Java is "underhyped". A particular language capability not being used as much as the author thinks it should be is the basis for this definition of "hype".

Definition #2 is about Java's marketplace effect: the "overhype" of Java in the 1990s was extrapolation, predicting Java's effect on the whole computing landscape. This type of "hype" overestimates Java's benefits and makes bold proclamations. Examples:

- Java and the JVM's WORA "Write Once Run Anywhere" will kill Microsoft's Evil Empire because it will render Windows irrelevant. (This didn't happen; MS Windows still has 70+% desktop market share today. 30 years later, Microsoft is one of the top 3 tech companies with a $3+ trillion market cap, while Sun Microsystems was acquired at a discount by Oracle.)

- Java will make lower level languages with manual memory allocation like C/C++ obsolete because CPUs are getting faster. Let the extra "unused" cpu cycles do automatic garbage collection in Java instead of C programmers manually managing memory with malloc()/free(). (C/C++ is still used today for games, and tight loops of machine learning libs underneath Python.)

- Java plugins will enable "rich" web experiences. (It turns out that Javascript and not Java plugins won the web. Java also didn't win on desktop apps. Javascript+Electron is more prevalent.)

That's the type of overhype that Java failed to deliver.

Same situation with today's AI. Some aspects of AI will absolutely be useful but some are making extravagant extrapolations (i.e. "AI LLM hype") that will not come true.

  • le-mark 16 hours ago

    > Java and JVM's WORA "Write Once Run Anywhere" will kill Microsoft's Evil Empire because it will render Windows irrelevant.

    There was that sentiment, but this doesn’t fully capture what the hype was about. Java out of the gate had Sun's network vision built in via JNDI, RMI, and object serialization. The hype was about moving applications onto the network and off of Windows or any particular vendor's OS.

    And this did come to pass, just not how Sun was selling it with Java. For example: Office, Microsoft’s crown jewel and anchor into a lot of organizations, is now almost entirely web based.

    • hedora 13 hours ago

      It didn’t really come to pass.

      Office web is comically slow, even when I have $5K of machines and 100ish GB of DRAM lying around my house.

      In the java vision, it’d transparently offload to that network of machines. Also, the user could decide not to offload to untrusted hardware (e.g., I don’t want to trust Microsoft’s cloud compute).

    • breve 8 hours ago

      > And this did come to pass, just not how Sun was selling it with Java. For example: Office, Microsoft’s crown jewel and anchor into a lot of organizations, is now almost entirely web based.

      But this will come full circle with the transition to WebAssembly. WebAssembly is like Java applets but it has the advantage of being built into the browser with nothing extra to install.

      Google switched the calculation engine of Google Sheets to WebAssembly (rewriting it in Java as it happens) and got double the performance versus JavaScript:

      https://web.dev/case-studies/google-sheets-wasmgc

    • zozbot234 15 hours ago

      > Java out of the gate had Sun's network vision built in via JNDI, RMI, and object serialization.

      It's kind of obvious since having a standard, platform-neutral virtual machine doesn't just enable WORA; it enables sending binary code and program state over the network, which is quite handy for all sorts of distributed computing flows. We'll probably do the same things using Wasm components once that tech stack becomes established.
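
      A minimal sketch of the property being described -- object state flattened to a platform-neutral byte stream and revived elsewhere. This uses plain JDK serialization; the `Task` record and `roundTrip` helper are illustrative inventions, not Sun's actual RMI machinery:

```java
import java.io.*;

public class WireDemo {
    // Any Serializable type can be shipped as bytes; a record keeps it short.
    record Task(String name, int priority) implements Serializable {}

    // Flatten an object to bytes and revive it, as if it had crossed the network.
    static Task roundTrip(Task original) throws Exception {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buf)) {
            out.writeObject(original);  // object state -> byte stream
        }
        try (ObjectInputStream in =
                 new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray()))) {
            return (Task) in.readObject();  // bytes -> live object again
        }
    }

    public static void main(String[] args) throws Exception {
        Task sent = new Task("index", 3);
        Task received = roundTrip(sent);
        System.out.println(received.equals(sent));  // prints "true"
    }
}
```

      The same bytes could just as well be written to a socket, which is the distributed-computing flow being described.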

    • ndiddy 14 hours ago

      I think the "move applications onto the network" idea basically killed Java on the desktop and relegated it to a backend only language. It wasn't a bad idea, but it was too early. Because of the focus on network distribution, the two main ways to distribute desktop Java software were Java WebStart (lengthy start-up times on early 2000s internet speeds, integrated poorly with your OS) and applets (only viable for corporate environments where the IT department had control over the whole network and all the clients on it due to security problems, no integration with your OS). If you had to distribute software that ran locally, you had to roll your own solution or buy some third-party tool since Sun didn't have anything that made this easy.

  • zozbot234 16 hours ago

    > Java will make lower level languages with manual memory allocation like C/C++ obsolete because CPUs are getting faster.

    Except that this actually happened wrt. a whole lot of application code. Sure, Java was slow and clunky but at least it was free of the memory unsafety that plagued C/C++. What was the mainstream "safe" alternative? There was no Rust back then, even Cyclone (the first memory-safe C style language) was only released in the mid-2000s.

    • jasode 15 hours ago

      >Sure, Java was slow and clunky but at least it was free of the memory unsafety that plagued C/C++. What was the mainstream "safe" alternative? There was no Rust back then,

      Before Sun's Java in 1995, companies built enterprise CRUD apps with memory-"safe" languages like MS Visual Basic, PowerBuilder, and xBase languages such as dBASE and FoxPro. This allowed them to develop line-of-business apps without manually managing memory in C/C++.

      • hedora 13 hours ago

        On top of that, when I last had the misfortune of using Java (17, maybe?), the GC was still a top cause of production outages and a massive productivity drag.

        They claimed they fixed it a dozen times, starting with JDK 1.4, and continued to claim that with every major release since.

        • gf000 7 hours ago

          Top cause for production outages? Like what, did you get OOM errors? Also, how can a GC be a productivity drag?

          Anyways, besides your very questionable "experience" with Java, it literally runs half of the internet, and it has by far the most mature and performant GC of any platform; no one else is even close in this category.

        • holowoodman 8 hours ago

          Java might be memory-safe. But it is "performance-unsafe".

          • Tostino 6 hours ago

            For certain problems. I'm looking forward to Project Valhalla finally landing, and the subsequent improvements that come later.

            Getting the performance benefits of data-oriented programming patterns is very exciting to me, allowing Java to run totally different classes of applications that haven't been practical until now.

    • majormajor 9 hours ago

      Java wasn't particularly responsible for that. And non-GC languages didn't even become obsolete - they're so relevant that they're one of the buzziest new-language-development areas.

      It didn't displace C-family languages on desktop OSes. There was a brief wave of Java-first desktop apps and people generally hated the UI toolkits and lack of first-party-feel and they went away.

      And on the server, it was just one of many non-C things like Perl, PHP, Python, Ruby, etc. Java became a standard "our dynamic language is too slow, port it to something else" destination but honestly the interpreted languages did more for killing C/C++ from a perspective of "CPUs are fast enough, and we're not CPU bound usually, just write in whatever's easiest."

      And now we have a whole wave of "rewrite something off of the JVM, or off of Go, or off of some other GC'd language in Rust" projects, because "just let the CPU use the spare cycles to GC" has never been fully realized.

      • gf000 7 hours ago

        > And now we have a whole wave of "rewrite something off of the JVM, or off of Go, or off of some other GC'd language in Rust" projects

        I think you vastly overestimate the size of that wave. Rust is a novelty in a specific niche and is changing/will change the low-level programming niche, but it won't noticeably alter how general business programming goes, not at all.

    • gompertz 15 hours ago

      Interesting! I never heard of Cyclone before. Looks like it came out of AT&T Labs Research and Cornell.

  • pyuser583 3 hours ago

    Java massacred COBOL and Ada. It was “the” web programming language for a while.

whobre 17 hours ago

It was ridiculous. They seriously wanted to rewrite everything in Java, including office and web browsers. It was 10 times worse than the recent “rewrite in Rust” mania and way more unrealistic.

  • hodgesrm 14 hours ago

    > It was ridiculous. They seriously wanted to rewrite everything in Java, including office and web browsers.

    There's another perspective. Many people were looking for something like Java well before it was released: VM-based, portable, modern object-orientation features, etc.

    Case in point: databases. In the early 1990s I worked on a project at Sybase that attempted to rewrite SQL Server from the ground up to bring in object [relational] support. The team focused on VM-based languages as a foundation, which were an area of active academic research at the time. Built-in object support, portability, and ability to support code generation for queries were among the attractions. The project started with Smalltalk (slow!), then moved to an acquired VM technology (it was bad!), and finally a VM we designed and built ourselves. These gyrations were a key reason why the project failed, though not the only one.

    When Java came out in 1995--I got access to the alpha release in September--it met virtually every requirement we were trying to fulfill. At that point most attempts to build new databases on other VM tech became instantly obsolete. (Other vendors were looking at VMs as well.)

    Not coincidentally Nat Wyatt and Howard Torf, a couple of key engineers from our project, founded a start-up called Cloudscape to pursue the Java route. They created the database we know today as Derby.

    Somewhat more coincidentally, Java became dominant in American DBMS development after 2000. Hadoop, Druid, Pinot, and HBase are just a few of the examples. I say "somewhat more coincidentally" because at that point most of us saw Java as simply more productive than C/C++ alternatives for building reliable, high-performance, distributed systems. That view has obviously evolved over time, but between JIT, dev tooling, and libraries it was definitely true in the early 2000s. It helps to remember how difficult C++ was to use at that time to understand this perspective.

    In summary, a lot of the hype was the usual new technology craziness, but Java also met the needs of a population of people that went far beyond databases. There was a basis for our excitement. Just my $0.02.

    Edit: typo

  • II2II 15 hours ago

    Two points:

    The hype accomplished something that would be otherwise impossible: it established Java as a language in what was likely record time. Consider another popular language: Python. It was created about 5 years earlier, yet it rose to prominence about a decade later. Or consider Rust. It is, in many respects, as significant as Java. While major developers were shipping significant applications written in Java within 5 years, Rust is only creeping into important software a decade after its introduction.

    The second point is it's easy to underestimate the dominance of Microsoft in those days. You think that Microsoft is dominant today? Well, that's nothing compared to the late 1990s. Microsoft's market share was closer to 95% of the PC market. The workstation market was starting to crumble due to competition from Microsoft and Intel. About the only thing that was safe was mainframes, and that was highly dependent upon one's definition of safe. Nearly everyone who was competing against Microsoft wanted to see a chunk taken out of them, which meant pretty much everyone, since Microsoft had its fingers in so many markets. And, as it turns out, nearly everything did have to be rewritten. Sometimes it was to deliver SaaS over the Internet and sometimes it was to target mobile devices.

    • zozbot234 15 hours ago

      If I had to pick a language that's "as significant as Java", I'd pick Golang way before Rust - and Golang has found significant success. The first genuinely usable version of Rust was only out in late 2018, so it's way too early to argue about its slow and "creeping" adoption curve.

      > The second point is it's easy to underestimate the dominance of Microsoft in those days. You think that Microsoft is dominant today? Well, that's nothing compared to the late 1990s. Microsoft's market share was closer to 95% of the PC market.

      By the late 1990s Linux had become a viable platform for a whole lot of things, and people were beginning to take notice. Most obviously, that probably put a big dent into the adoption of Windows NT as a server OS on x86 machines, which had been progressing quite well until the mid 1990s. That also probably helped Java because it meant you could seamlessly run your server workloads on "toy" x86 machines or on more "serious" platforms, without changing anything else.

  • klntsky 17 hours ago

    RIIR is justified in most cases because, for most apps, memory safety used to be the only reason to use a GC'd language.

    RIIJ was justified too, because people believed the web would end up being Java applets all the way down.

    • cenamus 16 hours ago

      And RIIJ also gives you memory safety

      • throw0101c 16 hours ago

        > And RIIJ also gives you memory safety

        I think Java helped in the mainstreaming of memory-safe and GC languages, especially in more corporate spaces where C/C++ was still mostly a thing.

        Certainly sysadmins were using a lot of Perl during that time, but for "real" enterprise software, I don't think dynamic-ish languages were as accepted. The use of Perl and rise of Python widened the Overton window.

      • mdaniel 16 hours ago

        And plausibly sandboxing, too, since the JVM used to carry a policy language with it that allowed granting access by package (or by public key, IIRC), a vestige from its days of running in the browser. But they recently killed that whole feature due to disuse

  • epcoa 17 hours ago

    Well all the parts that you didn’t write in XML and XSLT, etc.

  • burnt-resistor 8 hours ago

    I remember the Java hype c. 1999. It was such jump-the-shark ultrahype of the platform, the language, and OOP. It did do one thing for me, though: it allowed me to produce an installer for a niche DOS-based building code text search engine, and JNI allowed calling the functions to produce a Windows 95/NT desktop shortcut (a .PIF). I even added a feature where it was per-customer skinnable with a .BMP.

  • Disposal8433 16 hours ago

    I remember some greybeards hyping the CPUs that could run Java bytecode (it existed for a short time). I was a junior C++ fanboy at the time and I already knew that they were wrong.

    • brabel 15 hours ago

      Were you right because you knew something they didn't, or were you just as irrational (maybe more, given you were "a junior"?) but got lucky in being stuck with an opinion that eventually turned out right?

      • holowoodman 8 hours ago

        Everyone with lower-level systems programming or hardware experience knew, after taking a look, that Java bytecode was unfit for direct execution by a CPU. All the bits that you need to make decoding and execution fast in hardware were just missing. And the instruction set is complex in a way that makes the most complex CISC instructions look almost like RISC. You'd basically have to put another decoder stage into processors, to turn Java bytecode into CISC and that into RISC.

        Only applications and web people believed that particular part of the hype, because they didn't know what it took. And IBM consultants believed it because IBM always managed to sell some weird CPU extension for big dollars that did weird stuff, like their "decode XML extension".
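
        For anyone who wasn't there: the stack-machine, constant-pool-indirected shape of Java bytecode is easy to see with `javap -c`. The class below is just an illustration (exact javap output varies by compiler), but it shows the kind of instructions involved:

```java
public class Adder {
    // `javap -c Adder` shows roughly this for the method body:
    //   iload_0   // push local 0 (a) onto the operand stack
    //   iload_1   // push local 1 (b)
    //   iadd      // pop two ints, push their sum
    //   ireturn   // pop the result and return it
    static int add(int a, int b) {
        return a + b;
    }

    public static void main(String[] args) {
        // The call below compiles to invokestatic #n -- its operand is an
        // index into the class's constant pool, not a direct address: the
        // kind of indirection that is cheap for a software VM to resolve
        // but awkward to decode and execute directly in silicon.
        System.out.println(add(2, 3));  // prints 5
    }
}
```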

        • wpollock 5 hours ago

          Unfit for an x86 CPU, yes. But that architecture wasn't as locked in back then. For example, the Intel iAPX 432 CPU was designed to support OO languages. I believe it was designed with Ada in mind, and with GC support and other VM support. Java bytecode would probably have run well on it.

  • jmyeet 14 hours ago

    There are design decisions you can reasonably question in Rust but the big one that justifies its existence is memory safety. It's simply too important. Not everything needs it but key infrastructure, most notably Web browsers, do.

    I predict we will be having buffer overrun CVEs in C/C++ code for as long as we have C/C++ code.

    The reality of writing safe, multithreaded C/C++ on processors with out-of-order execution, context switching, and branch prediction is simply too complex to get right 100% of the time. Rust makes writing certain code difficult because it is difficult to do; C/C++ fools you into believing something is safe because you've never encountered the circumstances where it isn't.

    We have tools like valgrind to try and identify such issues. They're certainly useful. But you'll be constantly chasing rabbits.

    I've seen thread and memory bugs in production code written by smart, highly-paid engineers at big tech companies that have lain dormant for the better part of a decade.

    That's why Rust exists.

    • holowoodman 8 hours ago

      Problem is, unsafe rust is as crappy as C/C++ safety-wise, just with worse tooling. And lots of things that C/C++ does can't be done without unsafe rust.

      Rust will never take over and something like C/C++ will always be with us.

  • giantrobot 15 hours ago

    I was excited about ApplixWare Anyware Office[0] around 1999-2000. I'm pretty sure I got a copy bundled in a boxed copy of SuSE or RedHat. It was the first time I'd really seen a real productivity application written as an applet. It was an interesting idea that was eventually delivered by JavaScript.

    [0] https://www.applix.com/applixware/anywareoffice.html

  • more_corn 16 hours ago

    Agreed. I lived through it and it was seriously overhyped.

vanschelven 16 hours ago

Missing from the article — which is funny, considering it's written from the perspective of a university student — is how deliberate Sun’s academic strategy was. In 1998 they launched the "Authorized Academic Java Campus" program, licensing Java tech to universities and setting up official training centers. Even before that, they were offering Java tools free to schools for teaching and research.

Combined with a massive branding push — Sun doubled its ad spend from 1995 to 1997 — Java ended up everywhere in CS education. By the late ’90s, first-year courses using Java weren’t a coincidence; they were the result of a planned, top-down push.

  • ivanbalepin 8 hours ago

    > they were offering Java tools free to schools for teaching and research

    This is also underrated considering there once was an era when you had to pay a lot of money to use a compiler (or almost any software, really) and had to pay a lot of money to access documentation (!) oh what a crazy world it was.

  • nailer 15 hours ago

    I’m a professional programmer now, but I was not in the early 2000s, and I remember looking at my girlfriend’s computer science book. I saw the unnecessary boilerplate and leaky abstractions of 90s-style OOP and wasn’t sure if I was wrong or ‘serious’ programming had become insane. It was nearly a decade before I worked out that Java had nothing to do with Alan Kay’s original concept of OOP, and the rest of the industry started to abandon it.

    • ndiddy 11 hours ago

      There's a really good talk about the history of OOP going back to the early 60s that goes over this point. The whole "90s style OOP isn't OOP as it was originally intended" thing is largely a myth. When Kay and other OOP pioneers were describing Smalltalk and OOP to professional developers, they all used 90s Java style "compile-time hierarchy of encapsulation that matches the domain model" OOP to explain it. Smalltalk-80: The Language, a book by some of the original designers of Smalltalk, even uses the same shape class inheritance example you'll see in every introductory Java textbook. Here's a link to the talk, timestamped to when he goes over this. https://youtu.be/wo84LFzx5nI?t=785
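
      For reference, the textbook example being referred to looks like this in its standard Java form (a minimal sketch with invented class names) -- a compile-time inheritance hierarchy modeling the domain, with behavior overridden per subclass:

```java
abstract class Shape {
    abstract double area();  // each subclass supplies its own formula
}

class Circle extends Shape {
    private final double radius;
    Circle(double radius) { this.radius = radius; }
    @Override double area() { return Math.PI * radius * radius; }
}

class Square extends Shape {
    private final double side;
    Square(double side) { this.side = side; }
    @Override double area() { return side * side; }
}

public class Shapes {
    public static void main(String[] args) {
        // Dynamic dispatch picks the right area() for each element.
        Shape[] shapes = { new Circle(1.0), new Square(2.0) };
        for (Shape s : shapes) {
            System.out.printf("%.2f%n", s.area());
        }
    }
}
```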

      • zozbot234 10 hours ago

        > "compile-time hierarchy of encapsulation that matches the domain model"

        The talk doesn't mention it of course, but this still exists, even in "modern" languages that purport to be free of OOP! What do you think Typestate is - particularly in its genericized variety - other than a kind of "compile-time hierarchy of encapsulation that matches the domain model"? Change my mind: Typestate - and Generic Typestate only more so - is just good old OOP wearing a trenchcoat.
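
        To make the claim concrete, here is what a typestate-style API looks like when written out in plain Java (the `Closed`/`Open` classes are invented for illustration): each state is its own type, so an invalid transition is a compile error -- structurally the same encapsulation hierarchy as classic OOP:

```java
// State "Closed": the only legal operation is opening.
final class Closed {
    Open open() { return new Open(); }
}

// State "Open": you can send, or transition back to Closed.
final class Open {
    Open send(String msg) {
        System.out.println("sent: " + msg);
        return this;
    }
    Closed close() { return new Closed(); }
}

public class TypestateDemo {
    public static void main(String[] args) {
        // Legal protocol: Closed -> Open -> send -> Closed.
        Closed done = new Closed().open().send("hello").close();

        // new Closed().send("oops");  // would not compile: Closed has no send()
        System.out.println(done != null);  // prints "true"
    }
}
```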

      • nailer 10 hours ago

        Thanks! That’s an excellent talk. I agree with the premise at 11:10.

    • ecshafer 14 hours ago

      Java was not an attempt to get mainstream programmers to Lisp or Smalltalk; it was to get them halfway there from C++. It was just to make application software a little nicer, with less manual memory management.

      I also don't think we should blame Java the language for the OOP insanity that also infected C++, Delphi, etc. It was an industry-wide insanity that thought it could replace pesky programmers with a single god architect.

    • gf000 7 hours ago

      Actors have much more to do with Alan Kay’s original concept of OOP than anything called OOP today. It's simply a term that means something completely different to the whole industry now, so this should be more of a historical fun fact than something to live by.

jerf 17 hours ago

You can understand Java hype in 1997 by understanding it as selling the Java of about 2007, but Java of 1997 couldn't deliver. Both because it was a young language, and had all the problems of a young language like poor library support for just about everything, and because the hardware in 1997 wasn't ready to deliver on the hype. Even in 1997 we weren't really looking for web pages to take 60 seconds to "start up", and that could easily happen for a "Java applet" on a home computer. (Or worse, if trying to load the applet pushed the system into swap. In this era 32MB-64MB would be normal amounts of RAM and the OS, other apps, and the browser have already eaten into that quite a bit before Java is trying to start up.) And then it was fairly likely to crash, either the applet itself, or the whole browser process.

And it was just about shoved down our throats. They paid to get it into schools. They paid for ads on TV that just vaguely said something about Java being good, because they didn't really have anything concrete they could point to yet. They paid to have really bad enterprise software written in it and then jammed into schools, just to make sure we had a bad experience, like Rational Rose [1]... my memory may be failing me, but I think it was implemented in Java at the time, because it was a Swing app (another Java thing shoved down our throats but not ready for prime time, even by the standards of 1997). I was using it as an undergrad student in 1999 or so and I could hardly click on a thing without crashing it. Not the best look for Java, though I'm sure it was not Java qua Java's fault.

Still, it fits the pattern I'm trying to show here of it being grotesquely hyped beyond its actual capabilities.

They shoved enough money at it that they did eventually fix it up, and the hardware caught up in the 2000s, so it became a reasonable choice. Java isn't my favorite language and I still try to avoid it, but in 2025 that's just taste and personal preference, not because I think it's completely useless. But I feel bad for anyone in the 1990s ordered by corporate mandate to write their servers in Java because the ads looked cool or because Sun was paying them off. It must have been a nightmare.

In fact, you can understand the entire Dot Com era hype as selling the internet of about 2007 in 1997, or in some cases even 2017. It all happened, but it didn't all happen in the "year or two" that the stock valuations implied.

[1]: https://en.wikipedia.org/wiki/IBM_Rational_Rose

  • derriz 15 hours ago

    > Both because it was a young language, and had all the problems of a young language like poor library support for just about everything, and because the hardware in 1997 wasn't ready to deliver on the hype.

    Outside of Perl’s CPAN, library support in 1997 sucked for all languages. Being able to write a hash table or linked list in C was a valuable commercial skill, as nearly every single code base would include custom versions of these sorts of very basic data structures rather than pull them from a commonly used library.

    “Using a 3rd party library” meant copying a bunch of source code downloaded from who-knows-where into your source control repo and hacking it to work with whatever funky compiler and/or linker your project used.

    • jerf 15 hours ago

      I mean even by the standards of the time, though. The Java hype meant that if a UI wasn't written in Java, it sucked, so everything had to use the Java UI. But even as young as Windows still was at the time, its UI toolkits were much more developed than the Java ones. The Java ones looked like they were written to a bullet-point list of the minimal features needed to shove them out the door, in a new language nobody knew, which is probably because they were. As with Rational Rose, even as a student I could ram straight into a brick wall of missing features in every direction I turned. I can only imagine what a professional of that era had to deal with. Compare that with a modern student, who may still not know how to do a given thing, but whose main problem is that they don't know how to find or evaluate the hundreds of options that exist.

      I know that it wasn't like it is today where a casual weekend hobby project can easily pull in a few hundred libraries with just a couple of shell commands, but you still needed some things to get going. It was theoretically possible to sit down with a blank text editor and write assembly code that functioned as a GUI app, the last few dying gasps of that philosophy were around, but it's not what most people did.

      • derriz 13 hours ago

        Well, that’s a more specific criticism - that AWT sucked. I won’t defend Java GUIs from the period, having worked on a 1990s Java AWT-based GUI application. Although SWT apps like Eclipse were decent enough.

        I think in the context of the time, Java was simply following the least-common-denominator approach that was common for cross-platform GUI toolkits then - Tcl/Tk was hot, as were commercial products like PowerBuilder.

        This approach was to only include features/functionality which mapped directly to native widgets on all supported platforms - Mac, Windows, and some X11 toolkit. There were convincing arguments for this approach at the time - you got native look and behavior from the same code for all platforms Java ran on, like with Tk - but it quickly became apparent that this approach was a technological cul-de-sac limited to simple dialogs and form-based GUIs.

  • leoc 13 hours ago

    > and had all the problems of a young language like poor library support for just about everything

    I know you didn't say otherwise, but for anyone who wasn't there it should be emphasised that many of the deficiencies were core-language deficiencies not just library issues. Java people would blame the customers https://people.csail.mit.edu/gregs/ll1-discuss-archive-html/... https://people.csail.mit.edu/gregs/ll1-discuss-archive-html/... and a rush to market http://www.blinkenlights.com/classiccmp/javaorigin.html (in a self-congratulatory way, of course), but it's also pretty clear that the Java team itself had significantly overestimated how capable and sufficient core Java 1 was. In fact writing out the standard library was evidently an important learning experience there, one which gave birth to the Java-Hater's Handbook https://wiki.c2.com/?EffectiveJava . And before things eventually got better there was ofc lots of hype first about how Java was a shining jewel of minimalism, and then about how it was a fine language for plain everyday people who had no truck with fancy abstractions.

bane 5 hours ago

I remember this time well. For hacker types, Java was...not what we were looking for. But it checked a lot of boxes for the type of bland corporate development that makes up the majority of software development.

What Java really offered was letting companies where software development is a cost center minimize costs and reliably automate business processes with lower-skilled (and lower-paid) software developers. Unreliability is probably the number one destroyer of value in software: it carries the worst-case threat of downtime, and avoiding it requires more development time and better developers.

It "eliminated" most of the things that make software hard by simply getting rid of memory management as a problem. Doubling down on OOP and services everywhere lets companies divide and conquer software in ways that better align with business units. Many runtimes (VMs) had various types of sandboxing, which boosted security in a very immature infosec world. Being slow (at the time) wasn't a problem, as companies could just buy more hardware and still get the reliability that they needed.

Java has to be looked at as the entire ecosystem, and the TCO for an all Java world is generally lower than with an all C++ or C world, or an all Perl or Python world. Being the new COBOL is also why Java so rapidly took on "enterprise" cruft as a big part of its identity.

If you can get past all of that and are just looking for a nice, batteries-included language to work in, with a great cross-platform, highly performant VM, Java is a great language. It's added a lot of nicer things in the last few years as it's starting to shed a lot of the dead weight that accumulated during the worst of the enterprise-software ClassFactoryFactory years.

There are reasons it took off so well, and designing around the intended user community is one of them.

rr808 15 hours ago

Java and the JVM are actually very good. The real problem to me is the Java-enterprise way of thinking, which usually involves the Spring IoC container with too much magic that makes it really difficult to understand. Get off Spring; it's a great platform.

  • brabel 15 hours ago

    I've been a "mainly" Java programmer for 15+ years (I've used several other languages, but Java has remained the main one over all this time). I only did something like 1 year using Spring in one of my early jobs. So, when I saw some people online talking like "you ain't a Java programmer if you don't do Spring", I used to think they were complete idiots; I am living proof you can write lots and lots of Java and basically never encounter Spring.

    However, since then I've seen several surveys of JVM programmers, and apparently Spring is used in something like 80% of Java projects, so it's not surprising a majority of people, even Java developers, think that Spring is somehow mandatory if you're a Java programmer. But of course, it's just a framework, one of many; it's just the most popular one. Can you "do JS" professionally without knowing React today? I'd think so. I guess React is about as dominant in the JS world as Spring is in the Java world.

  • jsight 7 hours ago

    I'm often shocked at how complicated Spring and Spring Boot are, but then I remember that they won against EJB 2.x.

    They are certainly less complex than a lot of the messes from that era.

  • dzonga 12 hours ago

    I'm no Spring fan, but if you're gonna make something that will stand the test of time, then yeah, Spring. It provides you with an architecture to do things properly, a way to test through Testcontainers, and a familiar way to connect all kinds of databases.

    try the inverse and you will cry razor blade tears.

    • rr808 11 hours ago

      > it provides you with an architecture to do things properly

      This is what I disagree with. The Spring object-container model is not a good framework to build on. When I have to maintain someone else's Spring project, it's a nightmare to figure out what does what.

      • gf000 7 hours ago

        It might be a nightmare to figure out, but guess what is a bigger nightmare? Figuring out someone's shitty homegrown half of a "framework," with no documentation, full of bugs and vulnerabilities, etc.

        And that's the fair comparison for any sufficiently complicated business webapp.

        • rr808 4 hours ago

          I get that point about everyone rolling their own replacement. I'm not sure it's different, though, as it's common to hack Spring just to get it to work.

lordleft 16 hours ago

The fact that Java is still a go-to language for many companies, including technically sophisticated FAANGs like Google and Amazon, speaks to its robustness and utility. It’s a great language with staying power.

  • murukesh_s 15 hours ago

    Java is still the only go-to language for almost all of the Fortune 100 (or 500) companies, other than perhaps .NET. No other language, including Go, Rust (difficult to get devs from consulting companies), Python, or TypeScript (considered inferior by enterprise backend dev bros), is being used for building core backend APIs. Almost all the devs for these large enterprises are outsourced from large consulting companies like Infosys, Accenture, TCS, Wipro, etc., and all of them are still doing Java. I know it from working in large banks and later trying to sell a non-Java platform to these companies and failing just because it was not written in Java.

    Also, most large enterprises need distributed transactions, as they use multiple databases and message queues in a monolith architecture, and no other language has the lib/tooling that can beat Java.

    • zozbot234 15 hours ago

      > Java is still the only go-to language for almost all of the fortune 100 (or 500) companies other than perhaps .Net

      One factor in that choice is that Java can run seamlessly and with official support on mainframe and midrange compute platforms that are still quite popular among Fortune 100 and 500 companies. (Many of them are building "private clouds" as a replacement but it's a slow transition.) While you might be able to get other languages to run, sticking to Java is a broadly sensible choice.

    • joshdavham 15 hours ago

      > No other languages including Go, Rust (Difficult to get devs from consulting companies) […] are being used for building core backend APIs.

      Could you elaborate a bit further? People at consulting companies don’t use Go or Rust? Also, do these top Fortune companies recruit from consulting companies often?

      • murukesh_s 15 hours ago

        Yup, I was also in for a surprise. I waited a decade for Java to be dethroned but no.

        https://digitalcareers.infosys.com/infosys/global-careers?lo...

        Just search for Rust or Go and you'll see why. Infosys employs 350,000 employees, and almost all of them work for Fortune 500 companies. There isn't a single Rust or Go opening from what I can see. Go and Rust did not even make it into the dropdown.

        > top Fortune companies recruit from consulting companies often

        If you have worked in large banks, pharma, automobile (IT), or FMCGs, you know. There will be a couple of middle managers onsite (i.e. in the US), and the rest of the devs, often hundreds of them, are located offshore (Asia/South America).

      • bitwize 15 hours ago

        By "consulting companies" he means indentured-servitude shops that rent out programmers by the hundreds to large companies and even governments. You know, like Deloitte or Accenture.

        • murukesh_s 15 hours ago

          And almost all of them are trained in Java, .NET, SAP, Oracle, etc. Maybe 0.05% get trained in Rust or Go for some specialised requirements within a business division or so.

stephenlf 17 hours ago

> Hype is about excitement; it’s about the tantalising possibility that if you jump on board at just the right time, you’ll become part of something unprecedented and maybe end up rich and famous along the way. In the late 90s and early 2000s, a lot of people did exactly that – and, yes, many of them used Java along the way, and a fair few of those got rich and famous by getting in right at the beginning, and getting out before anybody realised their idea was never going to work

Did you just look three years into the future and write about the GenAI hype?

  • immibis 9 hours ago

    Probably writing in the present about cryptocurrency.

w10-1 13 hours ago

It's not illuminating to make inferences from freemium marketing shadows or feature comparisons, particularly if you want lessons you can apply today.

Follow the money.

Initially it was from VM licenses: from Netscape for browsers, Oracle for databases, and Borland for IDEs (Borland also wrote the first JIT). But except for databases, they were non-exclusive, and JavaSoft's free offerings undercut their licensees.

Then IBM cut a 10-year deal, while Microsoft's license went to court over trying to add features to get lock-in. At this time IBM created a free Eclipse to undercut the IDE market (Borland), with SWT as an alternative to Swing to capture developers for leverage.

But the big money was in enterprise, so J2EE licensing was much more airtight for Oracle, BEA, et al. That was a successful and long-lived franchise that also drove storage and compute revenues.

But people hated the complexity and compute resources and Google and Apple both decided to build rather than buy, so we got Spring, Swift, Go, and the whole native and container ecosystem.

You have to be aggressive and strict in building a monopoly, but you should be gentle and forgiving in maintaining it. Both Microsoft and AWS learned this lesson.

ynzoqn 4 days ago

> it was used throughout my degree course, right up to the final year module on programming language design where our coursework assignment was to build a Scheme interpreter – in Java.

It sounds good.

liampulles 15 hours ago

I started out as a Java dev. I came in around Java 8, and I loved using the streams API (still do) - trying to stream through everything is just enormous fun. And I loved adding all sorts of Spring magic to make programs do all sorts of fun things. And I loved using all sorts of package-private/protected stuff to define an intricate architecture, and making complex generic utilities to handle any 2 variations of an implementation.

And then, of course, I woke up and smelled the roses, and realized the mess I was making.

binarymax 16 hours ago

I will die on the hill that Java is inferior because it doesn’t have native support for unsigned numerics.

  • layer8 15 hours ago

    I used to think that way too, but there’s a good argument to be made that overflowing your integer types to negative values instead of to small (or large) positive values avoids a lot of silent bugs you’d otherwise have with unsigned types. A language solving that would need to work with static proofs of non-overflow, and have any desired overflow to be explicit.

    Java in the meantime has gained all the unsigned operations as methods in the Integer and Long classes, so the relatively rare cases when you need them are straightforward to handle.

    The only real annoyance is that byte is signed. At least there’s a bit of unsigned support in the Byte class now.

    Lastly, minor point, Java actually has an unsigned 16-bit integer type, called char.
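
    The points above can be sketched quickly; the methods below are real JDK APIs (unsigned helpers since Java 8), and the class is just a throwaway demo:

    ```java
    public class UnsignedOps {
        public static void main(String[] args) {
            byte b = (byte) 0xFF;                    // -1 as a signed byte
            int asUnsigned = Byte.toUnsignedInt(b);  // reinterpret as 255

            int big = 0x80000000;                    // Integer.MIN_VALUE as signed
            // Unsigned arithmetic without an unsigned type:
            long value = Integer.toUnsignedLong(big);            // 2147483648
            int half = Integer.divideUnsigned(big, 2);           // 1073741824
            boolean bigger = Integer.compareUnsigned(big, 1) > 0; // true

            // char really is an unsigned 16-bit integer type:
            char c = 0xFFFF;
            int widened = c;  // widens to 65535, never negative

            System.out.println(asUnsigned); // 255
            System.out.println(value);      // 2147483648
            System.out.println(half);       // 1073741824
            System.out.println(bigger);     // true
            System.out.println(widened);    // 65535
        }
    }
    ```
    
    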

  • rf15 15 hours ago

    I'm doing Java for my main work, and boy, this still doesn't sit right with me after decades in the space. Just give me my properly unsigned bytes, please.

    • pron 15 hours ago

      If x is a "properly unsigned" byte that has the value 1, what is the value of `x - 2 > 0` and why?

      The choice of having unsigned types or not is always one of the lesser evil, and in a language where emitting signals directly to hardware ports is not a primary use case, the argument that not having these types is the lesser evil carries a lot of merit.
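
      The trap in that question can be shown in Java itself by reinterpreting the same bits both ways (the class name is just for the demo): with modular unsigned arithmetic, 1 - 2 wraps to 2^32 - 1, so the "is it positive?" check silently succeeds.

      ```java
      public class UnsignedWraparound {
          public static void main(String[] args) {
              int x = 1;
              int diff = x - 2;  // -1 in signed arithmetic

              // Signed view: 1 - 2 < 0, as intuition expects.
              System.out.println(diff > 0);  // false

              // Unsigned view of the same bits: 1 - 2 wraps to 4294967295,
              // so a naive positivity check passes.
              System.out.println(Integer.compareUnsigned(diff, 0) > 0);  // true
              System.out.println(Integer.toUnsignedLong(diff));          // 4294967295
          }
      }
      ```
      
      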

  • pron 15 hours ago

    Other than unsigned types for FFI or wire formats (which Java supports just fine) or for bitfields (which Java doesn't have), what do you want unsigned numerics for?

    The risk of unsigned types (even without the C or C++ issues around mixing with signed types) is that too many people make the mistake of using them to express the invariant of "a number that must be positive", which modular arithmetic types are a really bad fit for.

    One possible use is for a memory-efficient storage of small positive values, say in a byte. But then you have to make a choice between forcing the value into a signed type for arithmetic (which Java easily lets you do with Byte.toUnsignedInt) and allowing signed and unsigned types to be mixed.

  • hashmash 15 hours ago

    The problem with having unsigned integer types is that it introduces new type conversion issues. If you call a method that returns an unsigned int, how do you safely pass it to a method that accepts an int? Or vice versa? A smaller set of primitive types is preferred, since it has fewer conversion issues.

    Unsigned integer types are only really necessary when dealing with low-level bit manipulation, but most programs don't do this. The lack of unsigned integers makes low-level stuff a bit more difficult, but it makes the language as a whole much easier. It's a good tradeoff.

    • TuxSH 14 hours ago

      > If you call a method that returns an unsigned int, how do you safely pass it to a method that accepts an int?

      Mandate 2's complement be used.

      > Unsigned integer types are only really necessary when dealing with low-level bit manipulation

      They also give one more bit of precision, useful when dealing with 32-bit integers (or below).

    • binarymax 15 hours ago

      Literally every other language with unsigned types handles this just fine?

      • hashmash 15 hours ago

        I guess it depends on what "just fine" means. What happens when a conversion is applied? Is there silent data corruption (C), or is there an exception (Ada, D) or perhaps a panic (Rust, Zig)? Is the behavior dependent on compiler flags?

        Keeping unsigned integer types out of the language makes things much simpler, and keeping things simple was an original design goal of Java.

      • pron 15 hours ago

        By no means do C or C++ handle unsigned types just fine. In fact, they're widely recognised as a source of problems even by those who think they're useful (when used carefully).

jmyeet 16 hours ago

I suspect many today don't fully appreciate the context of Java in the 1990s and how different the outlook was. A lot of what we take for granted now wasn't even imagined, and even if it was, it wasn't certain. Java was hyped on three fronts:

1. Desktop applications;

2. Server applications; and

3. Browser applications.

We had more platforms then. On the desktop front, Mac was in decline but still existed and was strong in certain niches. On the server front, there were many UNIX variants (e.g. Solaris, HP-UX, Digital UNIX). Cross-platform really was a big deal and much more relevant.

We still had desktop apps then. Being able to write a Swing app and run it "everywhere" was a big deal. Otherwise you were writing things in the likes of Visual Basic (and thus Windows-only), or I don't even know what Mac used at the time.

On the server, this was the very early days of Web servers. Netscape still existed and sold Web server software. The most common form of HTTP serving was CGI scripts, which had a significant start-up cost per request. There were other solutions like ISAPI/NSAPI. Java brought in servlets, which were persistent between requests. That was massive at the time.

It creates problems too but it's all tradeoffs.
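
The servlet advantage described above can be sketched with a toy handler (the class names here are made up for illustration, not the real Servlet API): the container creates the handler object once and reuses it for every request, so its fields persist, whereas CGI started a fresh process, with fresh state, per request.

```java
import java.util.concurrent.atomic.AtomicLong;

// Servlet-style handler: one long-lived instance serves all requests,
// so state like this counter survives between them. A CGI script would
// start at zero on every single request.
class HitCountingHandler {
    private final AtomicLong hits = new AtomicLong();

    String handle(String path) {
        return path + " -> hit #" + hits.incrementAndGet();
    }
}

public class ServletVsCgiSketch {
    public static void main(String[] args) {
        // The "container" instantiates the handler once.
        HitCountingHandler handler = new HitCountingHandler();
        System.out.println(handler.handle("/index"));  // hit #1
        System.out.println(handler.handle("/index"));  // hit #2: state persisted
    }
}
```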

And the last is Web applications. This was the newest area and had the most uncertain future. Java applets were pushed as a big deal and were ultimately displaced by Macromedia (then Adobe) Flash, which itself is now (thankfully) dead and we have Javascript applications. That future was far from certain in the 1990s.

I remember seeing demos of Java applets with animations and Microsoft doing the same thing with a movie player of all things.

Single-page applications simply didn't exist yet. If you wanted that, and honestly nobody did, it was a Java applet or maybe a Flash app. The Web was still much more "document" oriented. Serve a page, click something, serve a page for that, and so on.

I still wrote this form of Website in the 2000s and it could be incredibly fast: PHP5+MySQL with a sub-30ms load time. You could barely tell there was a server round trip at all.

So Java still exists but almost entirely in the server space. It's probably fallen away from other platforms like PHP, Python, Node.js, etc. But it absolutely changed the direction of tech. Java has left a lasting legacy.

I would go as far as saying that Java democratized the Internet. Prior to Java, every solution was commercial and proprietary.

  • rr808 8 hours ago

    Agreed, the dotcom boom was built on Java. Unfortunately it really backfired on Sun when everyone (who didn't die) figured WORA meant they didn't have to run on expensive Sun servers.

nurettin 14 hours ago

Today some of the most common development tools, like PyCharm, Android Studio, and DBeaver, are Java programs. Your Java programs will run on the most obscure platforms (AIX, AS/400), as promised on the package. So despite all the hype and sales tactics, they must have done something right.

vardump 17 hours ago

I hate "hype" as a word. It's often used without justification to bash something that is new and popular.

  • pixl97 16 hours ago

    Hype is when popularity goes beyond substance.

    • vardump 15 hours ago

      It's also often used when the popularity is earned. Unfortunately.

  • jsight 7 hours ago

    I agree with you, tbh. Whether something is over hyped or under hyped is a truly strange thing to debate given that the position can almost never be objectively measured.

more_corn 16 hours ago

I’ve hated Java since the day I first met it. That hatred has only grown as I’ve had to secure and optimize the jvm. If Java died today I would toast its demise and piss on its grave.

  • murukesh_s 16 hours ago

    Java served its purpose and served it well. Remember, it was replacing large mainframe systems, and developers needed their fair share of confidence: types, a JIT compiler, JDBC, JMS, EJB (I hated it, though it had the "enterprise" word in it to sell!), and transactions, especially distributed transactions, which were uncommon in other languages. In many languages the logic was scattered across multiple libs and frameworks, leading to inconsistencies.

    Even now in the Fortune 100 I don't think they use anything other than Java to perform daily tasks. Yeah, they are now more open to using Python for ML-related tasks and Node.js for compiling React and all, but for anything backend it's Java.

  • mdaniel 16 hours ago

    I'm hearing that you don't hate Java, you hate the JVM

    To help folks understand your perspective, what would you replace it with?