This is why nobody likes Java
Saturday, June 17th, 2017 | Programming
Recently, I wanted to pass a random number into a unit test. Sounds simple, right? It probably would be if I wasn’t writing it in Java.
The problem is that the class expected a BigDecimal. But RandomStringUtils returns a string. And you can’t just cast a string to a BigDecimal. So I converted it to a Long, and then converted that to a BigDecimal. Here is the code I ended up with:
BigDecimal pageNumber = BigDecimal.valueOf(Long.valueOf(RandomStringUtils.randomNumeric(1)));
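Here is the same dance as a standalone sketch, swapping in java.util.concurrent.ThreadLocalRandom for Commons Lang’s RandomStringUtils so it compiles with no extra dependency. (It turns out BigDecimal also has a String constructor, which skips the Long step entirely, for whatever that’s worth.)

```java
import java.math.BigDecimal;
import java.util.concurrent.ThreadLocalRandom;

public class RandomPage {
    public static void main(String[] args) {
        // A single random digit, 0-9, as a string -- standing in for
        // RandomStringUtils.randomNumeric(1), which needs Commons Lang.
        String digit = String.valueOf(ThreadLocalRandom.current().nextInt(10));

        // The two-step conversion from the post: String -> Long -> BigDecimal.
        BigDecimal viaLong = BigDecimal.valueOf(Long.valueOf(digit));

        // BigDecimal's String constructor gets there in one step.
        BigDecimal direct = new BigDecimal(digit);

        // Both routes produce the same value (and the same scale, 0),
        // so equals() is true here.
        System.out.println(viaLong.equals(direct));
    }
}
```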
Which started me wondering: how many ways to represent a number are there in Java? So, I looked it up. And came up with this list:
- AtomicInteger
- AtomicLong
- BigDecimal
- BigInteger
- byte
- Byte
- double
- Double
- float
- Float
- int
- Integer
- long
- Long
- short
- Short
Some of these are understandable. It makes sense to have separate storage for decimals and integers, for example. But do we really need both a short and a Short? And fifteen-odd different number types? It’s madness.
It wouldn’t be so bad if you could just compare two of them, or pass any of them to a function that expects a number. But Java is strongly typed, which means a world of pain whenever two pieces of code pick different types.
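To make that pain concrete, here’s a minimal standalone sketch of how comparisons between the wrapper types go wrong:

```java
public class NumberPain {
    public static void main(String[] args) {
        Integer i = 5;  // autoboxed int
        Long l = 5L;    // autoboxed long

        // Same value, different wrapper types: equals() is false,
        // because Long.equals() requires its argument to be a Long.
        System.out.println(l.equals(i)); // false

        // The primitives are fine: the int is widened to long
        // before the comparison.
        System.out.println(5 == 5L); // true
    }
}
```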
But that is what you get for trying to use a proper language, I guess.