By default, Swift does not implicitly convert between numeric types, so arithmetic operators require operands of matching types. For example, you’ll get a compiler error for the following code in Swift:

var anInt: Int = 10
var aFloat: Float = 20.0
var result = aFloat / anInt   // error: binary operator '/' cannot be applied to operands of type 'Float' and 'Int'

Instead, you have to explicitly convert one of them to match the other, like so:

var anInt: Int = 10
var aFloat: Float = 20.0
var result = aFloat / Float(anInt)   // construct a Float from the Int, then divide

I discovered this when I first sat down to play with Swift in the labs, and it quickly made sense to me as a design choice: it made me think much more explicitly about the underlying types and what I really wanted from the operation. This bit of code on GitHub would make the first snippet compile, presumably by overloading the arithmetic operators for mixed numeric operands. I know it’s tempting to use something like it to make life easier, but part of the beauty of Swift is that the language forces you to think about types explicitly. Doing so should result in better code, and will eliminate a whole class of bugs you may otherwise hit down the road.
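
For the curious, here’s a minimal sketch of the kind of overload such code would have to define. This is my own illustration, not the linked code, but Swift does let you declare global operator functions for mixed operand types:

// Hypothetical mixed-type division overloads. With these in scope,
// aFloat / anInt compiles: the Int operand is converted behind the scenes.
func / (lhs: Float, rhs: Int) -> Float {
    return lhs / Float(rhs)
}

func / (lhs: Int, rhs: Float) -> Float {
    return Float(lhs) / rhs
}

Note that every mixed pairing (Int and Double, Float and Double, and so on) would need its own overload for each operator, which is exactly the kind of combinatorial sprawl the language avoids by making you convert at the call site.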