Core Language

Phix has just five builtin data types:
        <-------- object --------->
        |                |
        +-atom           +-sequence
          |                |
          +-integer        +-string
       
  • An object can hold any Phix data, specifically either an atom or a sequence.
  • An atom can hold a single floating point numeric value, or an integer.
  • An integer can hold a single whole number (at least +/- 1,000,000,000).
  • A sequence can hold a collection of values, nested to any depth, or a string.
  • A string can hold a series of characters, or raw binary data.
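
The hierarchy above can be sketched in a few lines of Phix; the variable names and values here are purely illustrative:

```phix
object x                    -- an object can hold any of the following
x = 7                       -- an integer (a whole number)
x = 3.14                    -- an atom (a floating point value)
x = "hello"                 -- a string (a series of characters)
x = {1, 2.5, {"nested", 'c'}}  -- a sequence, nested to any depth

atom a = 2.5                -- an atom also accepts whole numbers, eg a = 2
integer i = 42              -- an integer accepts whole numbers only
sequence s = {1, 2, 3}      -- a sequence also accepts strings
string t = "text"
?x                          -- "?" is the Phix shorthand for print(1,x)
```

Assigning, say, 2.5 to i above would instead trigger a type check failure at that point.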
At the very low level, and only when needed, Phix provides all the tools required (about a dozen routines) to allocate some memory, set the appropriate bits and bytes, invoke operating system or third party routines (in a dll/so, typically written in C), and finally (this can be automated) release the memory for reuse. It is also possible to do all of that directly with inline assembly, which most users should quite rightly avoid. Such code (eg builtins\VM or demo\pGUI) is normally written once and then used in several different projects.
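
To give a flavour of those low-level tools, here is a small sketch using the standard allocate/poke/peek/free builtins; the byte values themselves are purely illustrative:

```phix
atom addr = allocate(4)             -- grab four bytes of raw memory
poke(addr, {0x12,0x34,0x56,0x78})   -- set the individual bytes
?peek({addr,4})                     -- read them back as a sequence
free(addr)                          -- release the memory for reuse
```

In the same vein, define_c_func and c_func are used to invoke routines in a dll/so, but as noted such code is the exception rather than the rule.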

However, for day-to-day use at the normal high/human-readable level, in Phix you just use the five simple types listed above. Anyone with experience of more "advanced" languages may well blurt out "ridiculous", or something similar or cruder, at this point. However, the Phix compiler/interpreter (which can re-compile itself, four times over, in just 6 seconds), the Edita programmer's editor, all the demonstration programs included in the package, and indeed the program used to generate this very help file (docs\phix\makephix.exw), use nothing else, and are a testament to the fact that these five simple types are perfectly adequate.

You could, in theory, write an entire application declaring all variables and parameters as type object, except that it would probably not catch errors the way you might expect it to. While Phix also allows user defined types to be declared, they are used primarily for validation and debugging purposes, rather than being fundamentally different from the five types above.
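
A user defined type is simply a function yielding true or false that is invoked automatically whenever the variable is assigned; a minimal sketch (the type name hours is illustrative):

```phix
type hours(integer x)
    return x >= 0 and x <= 24
end type

hours h = 8     -- fine
h = 25          -- triggers a type check failure here,
                -- rather than misbehaving somewhere later on
```

In other words, h behaves exactly like an integer, but with an extra validation hook that pinpoints bad assignments at the moment they occur.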