Java of Antiquity

hsoi blog, talk

While Swift and Objective-C might approach nil/null differently, ultimately they both give me syntax that lets me do something about it:

Forget about it.

In Java, dereferencing null is possible and of course leads to crashes (a NullPointerException at runtime). Thus, everywhere in your code you have to do this:

if (foo != null) {
    foo.something();
}

Which becomes verbose, tiresome, and error-prone.
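To make the tedium concrete, here's a sketch with a hypothetical Foo/Bar/Baz chain (names invented for illustration): the manual guard-at-every-hop dance, next to Java 8's Optional, which is about the closest thing Java offers to Swift's ?. chaining:

```java
import java.util.Optional;

// Hypothetical domain classes, invented purely for illustration.
class Baz { String something() { return "did something"; } }
class Bar { Baz getBaz() { return new Baz(); } }
class Foo { Bar getBar() { return new Bar(); } }

public class NullDance {
    // The manual dance: every hop in the chain needs its own guard,
    // and the nesting deepens with each hop.
    static String manual(Foo foo) {
        if (foo != null) {
            Bar bar = foo.getBar();
            if (bar != null) {
                Baz baz = bar.getBaz();
                if (baz != null) {
                    return baz.something();
                }
            }
        }
        return null;
    }

    // Java 8's Optional flattens the pyramid somewhat -- each map()
    // short-circuits on null -- but it's still far noisier than
    // Swift's foo?.getBar()?.getBaz()?.something().
    static String chained(Foo foo) {
        return Optional.ofNullable(foo)
                .map(Foo::getBar)
                .map(Bar::getBaz)
                .map(Baz::something)
                .orElse(null);
    }

    public static void main(String[] args) {
        System.out.println(manual(new Foo()));  // did something
        System.out.println(chained(null));      // null
    }
}
```

Both versions quietly do nothing when something in the chain is null, which is exactly the behavior Objective-C and Swift hand you for free.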

In Objective-C, if foo was nil it’d be no big deal to message it: [foo something];.

In Swift, foo wouldn’t be nil because we’d prefer non-optionals; thus I’m safe knowing that foo is valid and I can just call away: foo.something(). And of course, if foo were optional, sure, I’d have to add in a ?, but then I could still go ahead and call away, still doing the right thing: foo?.something().

I may be working in a modern environment, doing modern mobile development, but having to use Java? It feels like such a step backwards. I grant there’s history involved here, but if Apple could “abandon” Objective-C for Swift, Google certainly could abandon Java for… Kotlin? Go? Swift?
