Swift Factorial

6th July 2014 | Programming

With the introduction of Swift at WWDC 2014, I was eager to begin experimenting with this new language. My initial impression was that learning Swift is like learning a language that is vaguely foreign, yet inherits enough niceties and commonalities from other languages (such as C, Scala, Ada, and, of course, Objective-C) that it doesn't feel overly strange.

Following up on a recommendation, I decided to write a small Mac application in Swift that calculates factorials, including factorials of non-integer numbers. The result is Swift Factorial. The source code is available on GitHub. The Mac app can be downloaded here.

As simple and straightforward an exercise as this was, it proved useful in resolving an issue with calculating factorials for floating-point numbers on Intel Macs. EdenMath and EdenGraph currently use the following logic, based on the gamma-function identity x! = Γ(x + 1), to calculate the factorial of a floating-point number:

	/* lgamma() returns ln|Γ(x)| and, as a side effect, sets the
	   global variable signgam to the sign of Γ(x) */
	lg = lgamma(currentValue + 1.0);
	currentValue = signgam * exp(lg);

This worked fine when EdenMath 1.1.1 and EdenGraph 1.2.0 were introduced in 2004 and Macs were powered by PowerPC processors. On Intel chips, however, something doesn't work quite right at first (perhaps signgam hasn't been set yet), and the first factorial calculation returns 0. The new solution simplifies the calculation of non-integer factorials down to one line:

	newValue = exp(lgamma(originalValue + 1)) 
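
For reference, here is the fix expanded into a complete, self-contained C function (a minimal sketch; the factorial function name and the test value in main are illustrative, not taken from the EdenMath source). Since Γ(x + 1) is positive for every x > -1, the result needs no sign correction, and signgam is never consulted:

	#include <math.h>
	#include <stdio.h>

	/* Factorial of x, including non-integer x, via the identity
	   x! = Γ(x + 1). Γ(x + 1) > 0 for all x > -1, so the result
	   of exp(lgamma(x + 1.0)) needs no sign correction. */
	double factorial(double x)
	{
		return exp(lgamma(x + 1.0));
	}

	int main(void)
	{
		printf("4.5! = %f\n", factorial(4.5)); /* approximately 52.342778 */
		return 0;
	}

Because the result no longer depends on a global left over from a previous lgamma() call, this also sidesteps the uninitialized-signgam behavior seen on Intel Macs.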

This code fix will be incorporated into future versions of EdenMath and EdenGraph.