Archive for September, 2016

Hexlicense at 40% discount!

September 28, 2016

Delphi License management

Just a quick reminder that HexLicense, a pure native Delphi license management component suite, is currently offered at a 40% discount!

The discount is for the VCL binary distribution. However, if you buy the original $200 offer you also get access to the source-code (which compiles for Android and iOS).

The discount expires on 07.10.2016 so you have 9 days to take advantage of the offer!

License management?

All programmers who sell their software will sooner or later need to add some security to their applications; but even more important is keeping track of which serial numbers belong to which customer. After all, a customer has invested in your product, and the serial number is what secures access to the functionality you provide.

Online software vendors operate with lists of pre-defined serial numbers that you have to upload together with your binaries. When a sale transaction is completed, the online vendor removes one serial number from your list, delivers it to your customer, and you get notified through e-mail.

Generating serial numbers in bulk

Hexlicense is not just another key generator. It doesn't simply play around with letters or encode a customer's name into a fancy hex string. Those solutions are trivial at best and dangerous at worst. Hexlicense uses a technique not unlike certificate key-chains (but much easier to use).

With HexLicense you generate licenses from a root key, which is an array of 12 random bytes (you can also input whatever values you wish). From that key the license generator application can generate thousands of unique serial numbers, serial numbers which can only be mathematically validated with that exact root-key.

The HexLicense key generator makes creating license batches and root keys easy. You can export lists in xml, json, text and cds (TClientDataset) format

The system gives you plenty of callback events, which means hackers have to manually disassemble your application rather than just patch a single check. In software security this time-frame is called your window of sale. The longer you can keep this window open, by limiting the number of licenses for each root key (1000 keys per root is common; then recompile to make sure the binaries are different) and other techniques, the more sales you can secure.

But license management is first of all about the customer, not the hacker. License management is about keeping track of what product a customer has bought, the duration of such a license, trial types and building up a customer relationship.

Pure object pascal

The HexLicense package is pure object pascal; it comes in the form of 7 TComponent based controls. You also get a very well documented example that shows you how to validate a serial number, turning a trial application into a fully working application.

There is also a Delphi quick start guide on the website that explains everything you need to know (literally), so check it out. You can add license management to your existing applications in minutes.

The Delphi quick-start tutorial gives you all the information you need for turning your application into trial-mode, and activating the full set of features with a valid license key.

Generating licenses by code

The license manager application that you use to generate serial numbers is also written purely in Delphi. And it uses the exact same components that you find in the package. So the generator component is also available to you.

In other words, you can easily integrate HexLicense on your company server for better control over validation and registration.

HexLicense provides ready to use building blocks; all you have to do is set a few properties, connect the components together, handle two events – and you are ready to go!


JSON structures in Smart Pascal

September 18, 2016

JSON is more or less built into the JavaScript virtual machine [JSVM], which makes it very convenient to use. A simple call to JSON.stringify() is just about all you need to turn any JS object into a pristine JSON structure. And to turn it back into a JavaScript object you just call JSON.parse().
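For reference, the whole round-trip in plain JavaScript is one call each way:

```javascript
const person = { name: "Jon", tags: ["delphi", "pascal"] };

// object -> JSON text -> (new) object
const text = JSON.stringify(person);
const clone = JSON.parse(text);

console.log(text);        // {"name":"Jon","tags":["delphi","pascal"]}
console.log(clone.name);  // Jon
```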

Now arguing that JSON could be made simpler or in any way improved within Smart Mobile Studio would be, well, a bit lame. But what can be done is to abstract the use of JSON a bit. Most SMS coders are working quite close to the JSVM, mixing object pascal with JavaScript. That is one of the strengths of Smart Pascal, but it can also be regarded as a weakness.


It doesn't get easier than this, or does it?

If you ever used Turbo Pascal, High Speed Pascal or any of the "old time" compilers, you no doubt remember what happened when Delphi 1 came out. All that code written for Turbo became obsolete faster than it had to. Why? Because back then we mixed and matched Pascal with assembler. And when you wanted to move that code from DOS to Windows, you could kiss old-school assembler good-bye. The registers didn't match up, the new platform was 32 bit, the way Delphi and Turbo used the stack and organized instances was different — long story short: you couldn't just copy the code over.

While JavaScript is not going to change any time soon, it will not stay the same forever. And with that in mind I try my best to abstract the RTL as much as possible from the low-level code that actually gets the job done. Who knows, one day we may actually turn around and implement that LLVM compiler that's been floating around 😉


One of the units in the new RTL deals with structures. Not just JSON, but also pure binary structures just like Delphi or Freepascal would allocate.

So there is an abstract base class simply called TW3Structure in the unit "System.Structure.pas" (yeah I know, the names are cunning). And it simply looks like this:

  EW3Structure = class(EW3Exception);

  TW3Structure = class(TObject)
    procedure   WriteString(const Name: string; const Value: string; const Encode: boolean);
    procedure   WriteInt(const Name: string; value: integer);
    procedure   WriteBool(const Name: string; value: boolean);
    procedure   WriteFloat(const Name: string; value: float);
    procedure   WriteDateTime(const Name: string; value: TDateTime);

    function    ReadString(const Name: string): string;
    function    ReadInt(const Name: string): integer;
    function    ReadBool(const Name: string): boolean;
    function    ReadFloat(const Name: string): float;
    function    ReadDateTime(const Name: string): TDateTime;

    function    Read(const Name: string): variant; virtual; abstract;
    procedure   Write(const Name: string; const Value: variant); virtual; abstract;

    procedure   Clear; virtual; abstract;

    procedure   SaveToStream(const Stream: TStream); virtual; abstract;
    procedure   LoadFromStream(const Stream: TStream); virtual; abstract;
  end;

Next you have the JSON implementation in System.Structure.JSON, the XML version in System.Structure.XML – and raw binary in System.Structure.binary.

JSON serialization without baggage

When serializing with JSON we have so far operated either with variants (which you can turn into a JSON object with TVariant.CreateObject) or records. This is because raw serialization affects everything, including the VMT (virtual method table). So if you serialize a TObject based instance, it will include the VMT and whatever state it is in, which will cause problems when de-serializing the object back, depending on the complexity of the class.
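The equivalent pitfall is easy to demonstrate in plain JavaScript: the data fields survive the round-trip, but methods and class identity do not.

```javascript
class Point {
  constructor(x, y) { this.x = x; this.y = y; }
  length() { return Math.sqrt(this.x * this.x + this.y * this.y); }
}

const p = new Point(3, 4);
const revived = JSON.parse(JSON.stringify(p));

console.log(revived.x, revived.y);      // 3 4  - the data made it through
console.log(typeof revived.length);     // undefined - the method is gone
console.log(revived instanceof Point);  // false - it is now a plain Object
```

This is why records and variants are the safe currency for serialization: they carry no behavior that can be lost.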

In the last major update Eric (our compiler wizard) introduced classes that do not root in TObject. As you probably know, TObject has been the absolute base object in object pascal for ages. You simply could not create a class that does not inherit from TObject.

Well, that is no longer the case, and Smart Pascal allows you to define classes that are not based on TObject at all. This class form serves an important purpose: such classes can be declared as "external", so you can map them directly to JavaScript code. Say you have included a library that exposes objects you can create: if you know the structure, just define a class for it – and you can create it like any other object (!)

But what about creating pure JS based objects via code? Not defined classes or typed structures, but using code to carve out objects from scratch? Imagine being able to define what a record should look like – but at runtime. Sounds interesting? Indeed it does. And without much fuss I give you the TW3JSONObject class.
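Under the hood, this is the plain JavaScript idiom such a class wraps (a minimal sketch, not the TW3JSONObject source): start from an empty object and attach properties by name at runtime.

```javascript
// "Carving out" an object at runtime: no class, no type - just
// properties attached by name, enumerable whenever you need them.
const obj = {};
obj["FullName"] = "John Doe";
obj["Age"] = 42;

// Enumerate whatever the object happens to contain right now
const names = Object.keys(obj);
console.log(names.join(","));      // FullName,Age
console.log(JSON.stringify(obj));  // {"FullName":"John Doe","Age":42}
```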

While somewhat pointless at first sight, it does make writing packets easier for NodeJS services. And extracting info is likewise a snap:


  TW3JSONEnumProc = function (Id: string; Data: variant): boolean;

  TW3JSONObject = class
    function    GetPropertyCount: integer;
    function    ToString: string;
    procedure   FromString(const JSONText: string);
    function    Equals(const OtherObject: TW3JSONObject): boolean;
    function    Compare(const OtherObject: TW3JSONObject): boolean;

    function    &Read(const PropertyName: string): variant;
    procedure   &Write(const PropertyName: string; const PropertyData: variant); overload;
    function    AddOrSet(const PropertyName: string; const PropertyData: variant): TW3JSONObject;

    function    QueryPropertyNames: TStrArray;
    procedure   Clear;

    function    ForEach(const Callback: TW3JSONEnumProc): boolean; overload;

    function    &Contains(const PropertyName: string): boolean; overload;

    class function ForEach(const ObjReference: variant;
      const Callback: TW3JSONEnumProc): boolean; overload;

    class function &Contains(const ObjReference: variant;
      const PropertyName: string): boolean; overload;

    property    Values[const Id: string]: variant read &Read write &Write; default;
    property    Count: integer read GetPropertyCount;

    constructor Create; overload; virtual;
    constructor Create(const ObjReference: JObject); overload; virtual;
    constructor Create(const FromPrototype: Variant); overload; virtual;

    destructor  Destroy; override;
  end;

{$R 'object.keys.shim.js'}

const
  // Compiler/runtime generated identifiers that must be skipped during
  // enumeration. Illustrative subset - the list in the shipped unit is longer.
  __RESERVED: array of string = ['constructor', 'prototype', 'toString', 'valueOf'];

// TW3JSONObject

constructor TW3JSONObject.Create;
begin
  inherited Create;
end;

constructor TW3JSONObject.Create(const ObjReference: JObject);
begin
  inherited Create;
end;

constructor TW3JSONObject.Create(const FromPrototype: Variant);
begin
  inherited Create;
  (* perform a non-recursive prototype clone of values *)
  if FromPrototype.Valid then
    TW3JSONObject.ForEach(FromPrototype,
      function (Id: string; Data: variant): boolean
      begin
        if not (Id in __RESERVED) then
          ThisContext[Id] := Data;
        result := true;
      end);
end;

destructor TW3JSONObject.Destroy;
begin
  Clear;
  inherited;
end;

class function TW3JSONObject.&Contains(const ObjReference: variant;
  const PropertyName: string): boolean;
begin
  result := ObjReference.hasOwnProperty(PropertyName);
end;

class function TW3JSONObject.ForEach(const ObjReference: variant;
  const Callback: TW3JSONEnumProc): boolean;
begin
  result := true;
  if assigned(Callback) then
  begin
    if ObjReference.Valid then
    begin
      if ObjReference.IsObject then
      begin
        for var LItem in ObjReference do
        begin
          result := Callback(LItem, ObjReference[LItem]);
          if not result then
            break;
        end;
      end else
      raise Exception.Create('Enumeration failed, reference is not an object');
    end else
    raise Exception.Create('Enumeration failed, reference is null or unassigned');
  end;
end;

function TW3JSONObject.ForEach(const Callback: TW3JSONEnumProc): boolean;
begin
  result := true;
  if assigned(Callback) then
  begin
    for var LItem in ThisContext do
    begin
      result := Callback(LItem, ThisContext[LItem]);
      if not result then
        break;
    end;
  end;
end;

procedure TW3JSONObject.Clear;
begin
  var LItems := QueryPropertyNames;
  for var LProp in LItems do
  begin
    asm
      delete this[@LProp];
    end;
  end;
end;

function TW3JSONObject.QueryPropertyNames: TStrArray;
begin
  (* Perform manual enumeration *)
  for var LItem in ThisContext do
  begin
    // Avoid standard items
    if not (LItem in __RESERVED) then
      if not (Variant(ThisContext[LItem]).Datatype in [vdUnknown, vdFunction]) then
        if ThisContext.hasOwnProperty(LItem) then
          result.Add(LItem);
  end;
end;

function TW3JSONObject.GetPropertyCount: integer;
begin
  if (ObjectPrototype.keys) then
  begin
    (* Object.keys() is supported by all modern browsers, but equally
       important: NodeJS and io.js as well.
       There are however older runtime environments in circulation, so we need
       fallback mechanisms to ensure behavior. *)
    result := ObjectPrototype.keys(ThisContext).length;
  end else
  begin
    result := 0;
    for var LItem in ThisContext do
      if ThisContext.hasOwnProperty(LItem) then
        inc(result);
  end;
end;

function TW3JSONObject.&Contains(const PropertyName: string): boolean;
begin
  result := ThisContext.hasOwnProperty(PropertyName);
end;

function TW3JSONObject.Read(const PropertyName: string): variant;
begin
  result := ThisContext[PropertyName];
end;

procedure TW3JSONObject.Write(const PropertyName: string;
  const PropertyData: variant);
begin
  ThisContext[PropertyName] := PropertyData;
end;

function TW3JSONObject.AddOrSet(const PropertyName: string;
  const PropertyData: variant): TW3JSONObject;
begin
  ThisContext[PropertyName] := PropertyData;
  result := self;
end;

function TW3JSONObject.ToString: string;
begin
  result := JSON.Stringify(self);
end;

(* Compare does a property-name and property-value check *)
function TW3JSONObject.Compare(const OtherObject: TW3JSONObject): boolean;
begin
  result := false;
  if assigned(OtherObject) then
    result := ForEach(
      function (Name: string; Data: variant): boolean
      begin
        result := OtherObject.Contains(Name)
          and (OtherObject.Read(Name) = Read(Name));
      end);
end;

(* Equals does a structural compatibility check only *)
function TW3JSONObject.Equals(const OtherObject: TW3JSONObject): boolean;
begin
  result := false;
  if assigned(OtherObject) then
    result := ForEach(
      function (Name: string; Data: variant): boolean
      begin
        result := OtherObject.Contains(Name);
      end);
end;

procedure TW3JSONObject.FromString(const JSONText: string);
var
  LObj: variant;
begin
  LObj := JSON.Parse(JSONText);
  if LObj.Valid then
  begin
    for var LTemp in LObj do
      if not (LTemp in __RESERVED) then
        ThisContext[LTemp] := LObj[LTemp];
  end;
end;

Here is a little example code of how it’s used:

  // build an object property by property; chained calls return the object
  // (the property names here are just examples)
  var raw := TW3JSONObject.Create;
  raw.AddOrSet("FullName", "John Doe")
     .AddOrSet("Age", 42)
     .AddOrSet("Yet another!", TW3JSONObject.Create);

And extracting property information is likewise easy:

  var props := raw.QueryPropertyNames.join(",");


3D mathematics [book]

September 11, 2016

It's not often I promote books, but this time I'll make an exception: Mathematics for 3D Game Programming and Computer Graphics.

Sooner or later, all game programmers run into coding issues that require an understanding of mathematics or physics concepts such as collision detection, 3D vectors, transformations, game theory, or basic calculus.

A book worth every penny, even if you don't use 3D graphics very often.

No matter if you are a Delphi, Smart Pascal, Freepascal, C# or C++ programmer; sooner or later you are going to have to dip your fingers into what we may call "primal coding". That means coding that was established some time ago, and that has since been isolated and standardized in APIs. Which means that if you want to learn it, you are faced with the fact that everyone is teaching you how to use the API — not how to make it or how it works behind the scenes!

3D graphics

Once in a while I go on a retro-computer rant (I know, I know) talking about the good ol' days. But there is a reason for this! And a good one at that. I grew up when things like 3D graphics didn't exist. There were no 3D graphics on the Commodore 64 or the Amiga 500. The 80's and early 90's were purely 2D. So I have been lucky and followed the evolution of these things long before they became "standard" and were isolated away in APIs.

Somewhere around the mid 80's there was a shift in games and demo coding from top-down 2D graphics to isometric tiles (one of the first games to use the technique was Q*bert, released in 1982). So rather than building up a level with 32 x 32 pixel squares – you built up games with blocks drawn at 120-degree angles (or indeed, hexagon shaped tiles).

This was the beginning of “3D for the masses” as we know it because it added a sense of depth to the game world.


Qbert, 1982, isometric view

With isometric graphics you suddenly had to take this depth factor into account. This became quite important when testing collisions between sprites. And it didn't take long before the classical X,Y,Z formulas became established.

As always, these things already existed (3D computation was common even back in the 70s). But their use in everyday life was extremely rare. Suddenly 3D went from being the domain of architects and scientists – to being written and deployed by kids in bedrooms and code-shops. This is where the European demo scene really came to our aid.

Back to school

This book is about the math. And it's explained in such a way that you don't have to be good at it. Rather than teaching you how to use OpenGL or Direct3D, this book teaches you the basics of 3D rotation, vectors, matrices and how it all fits together.

Why is this useful? Because if you know something from scratch it makes you a better programmer. It’s like cars. Knowing how to drive is the most important thing, but a mechanic will always have a deeper insight into what the vehicle can and cannot do.


Every facet is explained both as theorem and practical example

This is the book you would want if you were to create OpenGL. Or, like me, when you don't really like math but want to brush up on old tricks. We used this math in demo coding almost daily when I was 14-15 years old. But I have forgotten so much of it, and the information is quite hard to find in a clear, organized way.

Now I don't expect anyone to want to write a new 3D engine, but 3D graphics is not just about that. Take something simple, like how an iPhone application transitions between views. Remember the cube effect? Knowing some basic 3D formulas, it's not a big challenge to recreate it in Delphi, C++, C# or whatever language you enjoy the most.

I mean, adding OpenGL or WebGL dependencies just to spin a few cubes or position stuff in 3D space? That’s overkill. It’s actually less than 200 lines of code.
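To back that claim up a little, here is a small JavaScript sketch of the two formulas at the heart of such an effect: rotation around the Y axis followed by a simple perspective projection. Textbook math, no GL dependency of any kind.

```javascript
// Rotate a 3D point [x, y, z] around the Y axis by `angle` radians
function rotateY([x, y, z], angle) {
  const c = Math.cos(angle), s = Math.sin(angle);
  return [c * x + s * z, y, -s * x + c * z];
}

// Project a 3D point onto a 2D screen with a perspective divide:
// the further away (larger z), the smaller the scale factor
function project([x, y, z], fov, viewerDist) {
  const scale = fov / (viewerDist + z);
  return [x * scale, y * scale];
}

// Spin one corner of a unit cube a quarter turn and project it
const corner = [1, 1, 1];
const rotated = rotateY(corner, Math.PI / 2);   // ~ [1, 1, -1]
const [sx, sy] = project(rotated, 256, 4);
```

Run those two functions over eight cube corners per frame, draw lines between them, and you have the spinning cube; that is essentially the whole trick.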

Well, I'm not going to rant on about this — this book is a keeper! A bit expensive, but it's one of those books that will never go out of date; the information inside is universal and timeless.




The case for Raspberry PI, FPGA and AmigaOS 4.1

September 10, 2016

First, thanks for so many good comments on the whole Amiga retro-emulation concept. I think there is a sort of resurgence today of the whole retro gear thing. On Facebook the Amiga forums and groups are growing, and there is really a sense of community there. Something I haven't experienced with the Amiga for well over a decade (or was it two?).

To those that grew up without an Amiga we "old timers" must seem nuts. But that is to be expected from generations growing up with 24/7 internet connections. I'm not dissing young programmers in any way, absolutely not; but I will make a case that you are missing out on something very valuable in terms of learning and evolving your skill.

“It’s just that it’s based on pre-existing hardware, not an imaginary instruction-set that
assaults the stack while raping the instruction cache”

The term "personal computer" (PC) doesn't really have any meaning today. I see that with my son as well. He has no personal relationship with his computer at all. The computer is a means to an end for him and his friends – a portal, if you like, to the reality on the internet. Be it Steam, Photoshop Express, chatting or whatever. Young coders sort of have a split reality, where friends online whom they have never met play a bigger role in their lives than, well, their best friend across the street.

Classic Amiga Workbench running under Amibian

People who grew up without the internet had only their computer to deal with. It was the center of music, demos, games and creativity. Be it coding, graphics, sound or whatever was the interest. The result was naturally that you created bonds to that computer that, to other people, could seem odd or silly. But the phrase "personal computer" is not just a throwback to the time when you no longer needed a campus mainframe or terminal. It also hints at a more personal approach to technology. Which is easy to forget in an age where you switch mobile phones once a year, and the average phone has more computing power than was on the planet in the 1970's.

Amiga emulation; why it’s a good thing

If we forget the visual aspects of the grey “classical” Amiga OS for a moment and put the looks on the backburner — why on earth should a modern programmer or computing enthusiast even consider Amiga OS? What could a 30-year-old tech bring to a modern world of high-powered CPU and GPU driven monsters?

In a word: efficiency.

AmigaOS thrives with just one megabyte of memory. Stop and think about that for a moment. The core operating system itself resides in a 512kb (half a megabyte) ROM – and the rest fits nicely on a couple of 720kb disks. So if we say that a full desktop experience can fit in 4-5 megabytes (if we include the programs, accessories and extras), what does that tell you?

It should tell you something about how the code is written. But secondly it should tell you about how we write code today (!)

“You think Linux is a fast and efficient operating system? You don’t have a clue”

An average Microsoft Windows installation is what? 16 gigabytes? You can probably trim it down to 8 gigabytes by removing services, graphics and drivers you don’t use. There is also a huge difference in the size of executables and the amount of information stored in the binaries — but ultimately it comes down to a shift in mindset that occurred back in the late 90’s: rather than forcing coders to write efficient programs, the emphasis was placed on the hardware to deliver enough power to run crap and bloated code.

Now being a programmer myself I have no illusions that if AmigaOS, this time the modern and latest 4.x version, was ever re-compiled for x86 it would naturally result in bigger binaries. Depending on the amount of drivers, you would probably end up with at least 512 megabytes to 1 gigabyte of software. Add presentation and media to that and we are quickly breaching the 1.5 to 2 gigabyte boundary. But that in itself would be revolutionary compared to the size of Ubuntu or Windows. Yet the core of the operating system is so small that many young developers find it hard to believe.

And yes, I know the Linux kernel and base packages can be squeezed down. But in all honesty, the Amiga has a much more interesting system. Some of the stuff you can do with shell scripting and ARexx on the Amiga, the lack of cryptic complexity, the ease of use and the control you as an end-user had; I'm sorry, but Linux is anything but user-friendly.

Why Raspberry PI

By any modern measure, the Raspberry PI is an embedded board at best, and a toy at worst. It exists somewhere on the cusp between single-function boards and a modern computer. But is it really that bad? Actually, it's bloody smashing. It's just that people haven't really been able to run anything written specifically for it yet.

Amibian, a Debian based distro that boots straight into UAE (the Unix Amiga Emulator) and runs the classical 16/32 bit Amiga OS, presently performs at 3.2 times the speed of an Amiga 4000/060. So for $35 you will own the most powerful Amiga ever devised. If you take it one step further and overclock the PI (and add a heat-sink so you don't burn out the SoC) it emulates the Amiga operating system at roughly 4 times the speed of the flagship high-end Amiga of the late 90's and early 2k's. You also get 32bit graphics, HDMI output, USB device access through the Linux sub-layer, built-in tcp/ip (and WiFi on the model 3b). And naturally: a hell of a lot more ram than the Amiga even needs (!)

AmigaOS 4, not exactly technologically impaired

Now remember, this is emulated on 68k instruction level (!) It is practically the same as running Java or CLR bytecodes (!) Which is a good parallel. People ask me why I bother with 68k; my reply is: why the hell do you bother with Java bytecodes if you don't have a clue what a virtual machine is! An emulator is a virtual machine in the true sense of the phrase. It's just that it's based on pre-existing hardware, not an imaginary instruction-set that assaults the stack while raping the instruction cache (yeah I'm looking at you, Java!).

Imagine then for a second what the situation would be if Amiga OS was compiled for Arm, running natively on the Raspberry PI with direct access to everything. You think Linux is a fast and efficient operating system? You don’t have a clue.

I mean, the PI was initially created to deliver cheap computing power to schools and educational centers, not to mention third-world countries. It made big waves as it blew the ridiculous "$100 PC per child" campaign out of the water (which was more an insult to the poor living in Africa than anything resembling help). Yet at the end of the day – what do these third world countries have to work with? Raspbian and Ubuntu are usable, but only superficially.

Try compiling something on the PI with a modern compiler. What would take less than a second to compile under Amiga OS can take up to 45 minutes to build under Linux on that SoC. If a kid in Africa starts learning C++ with Linux on a PI, he will be 59 years old before he can even apply for a job.

AmigaOS 4 is a sexy desktop

If AmigaOS 4 was ever compiled and set up to match the SoC firmware (which is another benefit of the PI: the hardware is fixed, so very few drivers would have to be made), it would revolutionize computing from the ground up. And I think people would be amazed at just how fast programs can be, when written to be efficient – rather than the onslaught of bloat coming out of Redmond (not to mention Ubuntu, which is becoming more and more useless).

The benefits for Hyperion Entertainment, which has done a phenomenal job in upgrading AmigaOS to the modern age, are many:

  • Increased sales of the operating system
  • Sale of merchandise surrounding the AmigaOS brand
  • Sale of SDK and associated development tools
  • The establishment of a codebase for OS 4 that is modern

If we take it one step further and look at what would be the next natural step:

  • Team up with case producers to deliver a more “normal size” case for the PI with keyboard
  • Team up with Cloanto to ship the old ROM files for the built-in 68k emulation layer

The point of all this? To build up money. Enough money for Amiga Inc, Commodore or Hyperion to buy time. Time enough for the codebase to grow and become relevant in the marketplace. And once established, to further the sale of a dedicated Amiga HW platform (preferably ARM or x86) and secure the investment the owners have made over the years.

FPGA, the beast of xmas future

FPGA (field programmable gate array) is the future. I don’t care how proud you are of your Intel i7 processor (I have a couple of those myself). Mark my words: 20 years from now you will be blazing behind your FPGA based computer. And I have no doubt that games and applications will adapt the hardware to their needs – with possibilities we can’t even dream about today; let alone define.


MiST. A low-cost FPGA computer capable of becoming an Amiga (and about 10 other platforms). The closest thing to new Amiga hardware to be created in 20 years.

Today's processors are fixed. They have a fixed architecture that is written in silicon and copper. Once cooked they cannot be altered in any way. Nanotubes are just about to drop, but again, the nature of fixed systems is that they cannot be altered once cooked.

FPGA however is based on gate logic. Which means (simply put) that the relations that make up the internal architecture are fluid, like a million doors that can be opened or closed to create all manner of living space. In many ways it's like a virus, capable of absorbing existing blueprints and becoming "that blueprint". If we dip into sci-fi for a moment, this is the computer variation of a xenomorph, a shape-shifter. A creature that can adapt and alter itself to become any other thing.

As of writing, this tech is in its infancy. It's just getting out there, and the prices and speed of these spectacular devices bear witness to its age and cost of production. If you want an FPGA with some kick in it, you better be prepared to take out a second mortgage on your house.


The Vampire 2 accelerator for the Amiga 600. This takes over and gives the Amiga so much power that it can play movies, modern music and 3D games faster than ever imagined. At the same time! In fact, I bought an A600 just to get this!

One of the cool things about this hardware is how it's being used today. One of the first hardware platforms to be devised for FPGA was (drumroll) the Amiga. And you have to understand that we are not talking just the 68k CPU here – but the whole bloody thing: Paula, Agnus, Denise, Fat Agnus and the whole crew of chips that made the Amiga so popular in the first place. All of it coded in gate-logic and uploaded to a chip that, with the flick of a switch, can turn right around and become an x86 PC, a PPC Mac, a Commodore 64, a Nintendo 64 or whatever tickles your fancy.

Let's stop and think about this.

Today we use virtual machines to mimic or translate bytecode (or pre-existing cpu instructions). We call these solutions by many names: virtual machine, emulator, runtime – but it's all the same really. Even if you slap a JIT (just-in-time compilation) into the mix, which is the case for both emulators and the Java and .NET compilers — it still boils down to running an imaginary (or pre-defined) platform under the umbrella of a fixed piece of kit.

Now what do you think would be the next logical step in that evolution?

The answer is naturally being able to sculpt virtual machines in hardware (read: fixed hardware that gives you a non-fixed field). Producing fixed processors is a costly process. Yet primitive when we really look at it. We may have shrunk the brilliance of the 1800's and early 1900's and made all the great inventions of the past fit on the head of a pin — but it's all based on the same thing: physical implementation. Someone has to sit there with a microscope and hammer the damn circuits out (although "hammer" is maybe the wrong word at the particle level).

This is also the problem with western culture: the use-and-throw-away mentality that creates mountains of technological garbage – and powers child labour and sickness beyond belief in poor parts of the world. You have six-year-old kids that work with melting out copper and mercury. A waste of life, tech and nature. So yeah, there is a bit of morality in this as well.

FPGA represents, really, the first actual breakthrough and paradigm shift since the invention of the first electric circuit. For the first time in history a medium has been created that is not fixed. It has to be created of course, and it's not like it's wet-wire technology or anything — but for the first time, anyone with the skill to code the gates can shape and adapt the hardware without the need to cook the chips first.

And they can be infinitely re-used, which is good for both people and nature.

Think about it… then go "holy cow".

And that my friend – is the thought of the day!


Smart Pascal, Pastafari PI, Amiga and all

September 9, 2016

Hectic days, so I don't have as much time to blog as I used to. I have also found a new interest in electronics (total newbie, but I love it), so I jump from one thing to the next.

NodeJS, assembly and virtual machine

Whenever I have time I try to work as much on Smart Mobile Studio as I can. I keep working at a steady pace on the RTL.

At the same time I am also making headway on the assembler written in Smart Pascal. It is quite important for some of the future plans, and to be perfectly honest – it's pretty cool! Basically I have mixed classic Acorn (ARM), MC68030 and x86 assembly language and adapted it to JavaScript (so no pointers, only references and offsets). The instruction set is fairly standard:

Unlike Java or the CLR, this one is register oriented. One of the major weaknesses of Java is how it pushes everything on the stack, making calls slower than they have to be.

  • 32 data-agnostic CPU registers
  • Stack and frame
  • Program counter (PC)
  • Variable management
  • Inline constants
  • Asm compiles to codesegment assembly format
    • Support for const resource chunk
    • Instance-frame oriented
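
The machine described by this list can be sketched as a small state record. Below is a minimal Python sketch (Python rather than Smart Pascal, purely for illustration); every name in it is my own invention, not taken from the actual RTL:

```python
# A rough sketch of the machine state implied by the feature list above.
# All names here are invented for illustration; the real implementation
# lives in the Smart Pascal RTL and is not shown in this post.

class VMState:
    def __init__(self):
        self.registers = [None] * 32   # 32 data-agnostic registers
        self.stack = []                # value stack
        self.frames = []               # stack frames (instance oriented)
        self.pc = 0                    # program counter (PC)
        self.variables = {}            # variable management: id -> value
        self.constants = {}            # const resource chunk: id -> value

vm = VMState()
vm.registers[0] = "text"   # data-agnostic: a register can hold a string
vm.registers[1] = 42       # ...or a number, with no type tag on the slot
```

“Data agnostic” simply means a register slot does not care what it holds, which is a natural fit for a JavaScript host.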

The instruction set thus far is very fundamental. It contains instructions you will find on any processor (more or less).


Parameters always start with the destination (destination, source). Most instructions support all three addressing modes (register, constant/resource identifier, or inline data).

Inline data instructs the CPU to read a data segment directly following the instruction. For example, this is a perfectly valid assembly snippet:

  LDD R0, "This is a string "
  MOV R1, R0
  ADD R0, R1
  ; r0 now contains "This is a string This is a string"

You can, however, put that string (which is a constant) into the resource chunk of the bytecode format. Then you can reference it by id:

  ; here presuming the string has the id $200
  LDC R0, C[$200]
  MOV R1, R0
  ADD R0, R1
  ; Same result as above

The full list of mnemonics so far:

  • Alloc [identifier]
  • Free [identifier]
  • LD.C [register], [resource id]
  • LD.V [register], [variable id]
  • LD.D [register], [inline data]
  • PSH.C [resource id]
  • PSH.R [register]
  • PSH.D [inline data]
  • POP.R [register]
  • POP.V [variable id]
  • MOV.R [register], [register]
  • MOV.V [variable id], [register]
  • MOV.D [variable id], [inline data]
  • ADD.R [register], [register]
  • ADD.C [register], [resource id]
  • ADD.V [register], [variable id]
  • ADD.D [register], [inline data]
  • SUB.R [register], [register]
  • SUB.C [register], [resource id]
  • SUB.V [register], [variable id]
  • SUB.D [register], [inline data]
  • JSL [offset]
  • JSE [register], [offset]
  • BNE [offset]
  • BEQ [offset]
  • RTS
  • NOOP
  • CMP.C [register], [resource id]
  • CMP.V [register], [variable id]
  • CMP.D [register], [inline data]
  • MUL.R [register], [register]
  • MUL.C [register], [resource id]
  • MUL.D [register], [inline data]
  • DIV.R [register], [register]
  • DIV.C [register], [resource id]
  • DIV.D [register], [inline data]
  • AND.R [register], [register]
  • AND.C [register], [resource id]
  • AND.D [register], [inline data]
  • OR.R [register], [register]
  • OR.C [register], [resource id]
  • OR.D [register], [inline data]
  • NOT.R [register], [register]
  • NOT.C [register], [resource id]
  • NOT.D [register], [inline data]
  • MULDIV.R [register], [register]
  • MULDIV.C [register], [resource id]
  • MULDIV.D [register], [inline data]
  • XOR.R [register], [register]
  • XOR.C [register], [resource id]
  • XOR.D [register], [inline data]
  • LSR.R [register], [register]
  • LSR.C [register], [resource id]
  • LSR.D [register], [inline data]
  • LSL.R [register], [register]
  • LSL.C [register], [resource id]
  • LSL.D [register], [inline data]
  • MOD.R [register], [register]
  • MOD.C [register], [resource id]
  • MOD.D [register], [inline data]
  • SYSCALL [method id]
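
The two snippets further up can be mimicked with a toy dispatch loop. This is only a sketch of the idea, in Python for illustration: the tuple encoding is invented, and the real VM decodes a binary codesegment format rather than Python tuples.

```python
# A toy interpreter loop for the mnemonics used in the snippets above.
# The tuple "encoding" is invented for illustration only.

def run(program, constants=None):
    regs = [None] * 32           # 32 data-agnostic registers
    consts = constants or {}     # const resource chunk: id -> value
    pc = 0                       # program counter
    while pc < len(program):
        op, *args = program[pc]
        pc += 1
        if op == "LDD":          # load inline data into a register
            regs[args[0]] = args[1]
        elif op == "LDC":        # load from the resource chunk by id
            regs[args[0]] = consts[args[1]]
        elif op == "MOV":        # copy source register to destination
            regs[args[0]] = regs[args[1]]
        elif op == "ADD":        # add source into destination register
            regs[args[0]] = regs[args[0]] + regs[args[1]]
        else:
            raise ValueError("unhandled opcode: " + op)
    return regs

# The inline-data example from above:
regs = run([
    ("LDD", 0, "This is a string "),
    ("MOV", 1, 0),
    ("ADD", 0, 1),
])
# regs[0] now holds the string twice

# The resource-chunk variant, with the string stored under id $200:
regs2 = run([
    ("LDC", 0, 0x200),
    ("MOV", 1, 0),
    ("ADD", 0, 1),
], constants={0x200: "This is a string "})
```

Because the registers are data-agnostic, ADD concatenates strings here just as it would add numbers, which is exactly what the string example relies on.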

The second half is the disassembler – and then the most important part of all: the CPU, or virtual machine. Like the disassembler it decodes the instruction bits, but it then executes each instruction accordingly. At high speed, I might add.

So, what on earth is that good for, you ask? Well, I have written it in a way that makes it easy to port to Delphi and FreePascal. So if you are into creating programming languages, game engines, portable services, emulators or whatnot — then this is a very handy piece of tech.

Once you have a working virtual machine, you can build the high-level language on top of that. And it's portable: you can execute the code inside your Smart Mobile Studio applications, your NodeJS services — or on Windows, Linux and OS X (as long as FreePascal is there, you are good). That opens up some interesting ideas.

Pastafari PI

Those that read my blog know that I absolutely love retro machines. Atari, Commodore 64, Amiga, Acorn — I love them all. I grew up with the ZX81, Speccy, C64 and Amiga – so naturally I have so many fun memories with these systems that it's bound to influence me as an adult.

Now, I had a broken Amiga 500 in the basement, and I figured — why fork out $200+ for a PITop or some other solution when I can do a retro-mod myself?

After all, distros like Amibian (Debian-based Linux) boot straight into the emulated Amiga environment. And the speed is phenomenal! The PI 3 emulates the Amiga 3.2 times faster than the most high-end Amiga ever created. With overclocking we are looking at speeds up to 4 times faster than a juiced-up Amiga 4000/060 (!)


Well, it's not finished yet, but I have basically cut the case and made room for a fancy ADX gaming keyboard with sexy LED lighting. I had to solder the keyboard controller to make it fit onto the Amiga keyboard back-plate – and also cut the keyboard down quite heavily.


The idea is that once I have all the internals working, I will do a lot of epoxy and plastic work to make it look more authentic. Right now it looks very rough and rugged, but it runs like a dream!

I also bought a cheap 2.5″ multi SD-card reader. Sadly, that came with an internal USB motherboard socket — so I had to cut and do some soldering to turn it into a normal external USB connector. Now it just plugs into the PI.


I also adapted the floppy-drive opening on the side of the Amiga, so the SD-card reader now sits where the original 3.5″ floppy drive once lived.

I'm just waiting for the SD-card extender circuit so I can adapt the front of the Amiga and have the SD-card slot for the PI there. This will make it a snap to change SD cards whenever I want to use another operating system. I'm also soldering up a reset and a shutdown switch that will be on the front as well (that thin region just above the keyboard).


To experienced technicians this no doubt looks like a complete mess – but this is my first ever hands-on electronics project. I haven't touched a soldering iron since high school, so it was nerve-wracking digging into the keyboard controller.


Enjoying some Pastafari PI computing – tremble before his noodly appendages!

The final step is naturally to do some plastic work. I bought a good dose of epoxy for this. Once that is done I have to sand everything down and make the cuts straight and better looking.

And then finally I can give it two coats of black spray paint, and that final coat of transparent paint for hardening. And voilà — I'll have a pretty cool rig for testing and working with my Raspberry PI 3!

Amiga Reloaded, can I preorder?

September 1, 2016 14 comments

Without much fanfare, some brilliant news made it into the retro-computing scene yesterday, namely that our German superheroes over at Individual Computers GmbH have acquired the rights to the Commodore name.


A brand new C64 motherboard, still going after all that time

“The Amiga for me represents a whole timeline of computing history that was aborted, a timeline which, had it been allowed to continue, would have given the world a much better experience of computing”

Individual has been shipping their Commodore 64 replacement motherboard for some time, which is apparently a very popular product with people in the C64 scene. I would love to get my hands on one, but while I grew up on a C64, my computing life really started with the Commodore Amiga.

It just won't die

The Amiga home computer is a paradox wrapped in an enigma. It's been out of production since the 90s, parts cost more than a used car – yet thousands of people around the world use this (by today's standards) ancient computer platform daily.

So what is it about this computer that simply refuses to die? Why do people, young and old, love this 30-year-old computer?

I can only speak for myself, but I think it has to do with the fact that the Amiga was murdered. That is how I feel, anyway. It was in the prime of its life, and was killed and replaced by backwater, poorly made computers that didn't deserve to win. So I think maybe, if I'm honest, it's a classic case of martyrdom.

The Amiga for me represents a whole timeline of computing history that was aborted, a timeline which, had it been allowed to continue, would have given the world a much better experience of computing. Not to mention our capabilities as a race with regard to data processing in all avenues of life.


My Raspberry PI 3 Amiga is just fantastic!

I have never seen PC users get into a physical fist-fight over their Pentiums, or AMD users bashing Intel users in the head — but I have seen Amiga users go head to head at copy parties, beating the living daylights out of each other. You can't buy that level of dedication; it has to be earned. I don't think any other computer enjoys a mass of users that actually love, in the true sense of the word, every inch of their platform.

But the Amiga does.

And those that grew up on the Amiga won't rest until it's resurrected, which incidentally can now actually become a reality.

Amiga Reloaded

I sent an email to Individual asking them about the Amiga 1200 and whether it was part of their plan. I mean, they have now finally wrested the rights to the Commodore name from the hands of vultures.

I actually got chills when I read their reply:

The A1200 is also on the agenda, yes

I was supposed to get into bed before midnight, but by the time the mental storm had passed I found myself messing around in UAE at 3 o’clock in the morning!

What goodies could a dedicated hardware shop like Individual Computers introduce in a new Amiga? In my mind the ultimate reloaded Amiga would be something like this:

  • FPGA running the show
  • Stuff AGA modes into the FPGA core, pure chunky out!
  • A solid 512 megabytes of memory would be nice
  • HDMI out
  • USB for mouse
  • SATA port
  • WiFi on chip

The above list is just my hopes for what an A1200 Reloaded could look like. But to be perfectly honest I would be happy just being able to buy a slightly pumped-up A1200 at a reasonable price. If nothing else, then to stick it to the morons on eBay charging $6000 for an Amiga 1000 (it's gotten way past ridiculous).

Updated: The specs

Stefan Egger pointed out that a draft of the specs is online, and sadly (if this is the working draft) it looks poor compared to my hopes. But the article does start with “The following is a preliminary specification. Things may change”. Head over to read the specs.

But it's still good news, I think. If nothing more than to at least break the monopoly going on at eBay. But for me personally, if this is what they are going for, I will probably have to order a Vampire 2 accelerator before I even want to get near it.

I was seriously hoping for a “minimig” FPGA in the 120-200 MHz range. Just drop the physical custom chips and dump them into the FPGA core — which would make for a very resilient computer, limited only by gate speed and overall performance.

Oh well, at least the spare parts problem is about to solve itself 🙂