Like Common Lisp, Python, Ruby and Perl 6, and to some extent unlike Perl 5, Spark will have a rich type system. However, it won’t be statically typed like Haskell: had Spark been statically typed, it could no longer have been considered a Lisp, and I happen to like dynamic typing.
A variable can be assigned values of different types during run time, and functions will accept arguments of any type (unless they explicitly forbid it).
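As an analogy, this is how such dynamic typing already behaves in Python, another of the languages Spark is compared to above (the variable and function names here are illustrative only, not part of Spark):

```python
# A variable rebound to values of different types, and a function
# that accepts any of them.
x = 42            # an integer
x = "forty-two"   # now a string: rebinding across types is allowed

def describe(value):
    # Accepts a value of any type unless it explicitly checks one.
    return f"{value!r} has type {type(value).__name__}"

print(describe(42))           # 42 has type int
print(describe("forty-two"))  # 'forty-two' has type str
```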
The Spark type system will be extensible at run time, and will be analogous to its Object Oriented Programming (OOP) system. As a result, one will be able to call methods on pieces of data, on expressions and S-expressions, and on functions, macros, classes and method declarations, as well as on their applications.
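Python, another dynamically typed language, already offers a flavour of this: functions and classes are themselves objects whose attributes and methods can be inspected and called (the names below are illustrative, not Spark APIs):

```python
def greet(name):
    """Return a short greeting."""
    return f"Hello, {name}"

# A function is an object: it has attributes and methods of its own.
print(greet.__name__)   # greet
print(greet.__doc__)    # Return a short greeting.

class Greeter:
    pass

# A class is an object too; mro() is a method called on the class itself.
print(Greeter.mro())
```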
In Spark “everything will be an object”, but unlike Java, it won’t be overly-OO. One won’t need to instantiate a class and declare a method just to print “Hello World” on the screen. This will work:
$ spark -e '(say "Hello World")'
Or:
$ spark -e '(print "Hello World\n")'
Or:
$ spark -e '(-> "Hello World" say)'
-> is a simple macro that converts (-> obj method args) to (method obj args), and is used for some syntactic sugar.
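Python has no macros, so the rewrite cannot happen at expansion time, but the effect of that conversion can be sketched as a plain run-time function (the name thread_first is my own, not Spark’s):

```python
def thread_first(obj, method, *args):
    # Mimic (-> obj method args) by calling (method obj args).
    return method(obj, *args)

def shout(s, punctuation="!"):
    return s.upper() + punctuation

# (-> "Hello World" shout) rewrites to (shout "Hello World")
print(thread_first("Hello World", shout))  # HELLO WORLD!
```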