JSON structures in Smart Pascal
JSON is more or less built into the JavaScript virtual machine (JSVM), which makes it very convenient to use. A simple call to JSON.stringify() is just about all you need to turn any JS object into a pristine JSON structure. And to turn it back into an instance again, you just call JSON.parse().
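In raw JavaScript, that round-trip looks like this (a minimal sketch; the packet layout is made up for illustration):

```javascript
// Turn a plain JS object into JSON text, then back into a fresh instance.
// The packet layout here is purely illustrative.
const packet = { id: 42, name: "test", tags: ["a", "b"] };

const text  = JSON.stringify(packet);  // object -> JSON text
const clone = JSON.parse(text);        // JSON text -> new object

console.log(text);      // {"id":42,"name":"test","tags":["a","b"]}
console.log(clone.id);  // 42
```

Note that `clone` is a brand-new plain object; it shares nothing with `packet` except the data that survived the text round-trip.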
Now, arguing that JSON could be made simpler or in any way improved within Smart Mobile Studio would be, well, a bit lame. But what can be done is to abstract the use of JSON a bit. Most SMS coders work quite close to the JSVM, mixing Object Pascal with JavaScript. That is one of the strengths of Smart Pascal, but it can also be regarded as a weakness.

It doesn't get easier than this. Or does it?
If you ever used Turbo Pascal, High Speed Pascal or any of the "old time" compilers, you no doubt remember what happened when Delphi 1 came out. All that code written for Turbo became obsolete faster than it had to. Why? Because back then we mixed and matched Pascal with assembler. And when you wanted to move that code from DOS to Windows, you could kiss old-school assembler good-bye. The registers didn't match up, the new platform was 32-bit, and the way Delphi and Turbo used the stack and organized instances was different. Long story short: you couldn't just copy the code over.
While JavaScript is not going to change any time soon, it will not stay the same forever. And with that in mind I try my best to abstract the RTL as much as possible from the low-level code that actually gets the job done. Who knows, one day we may actually turn around and implement that LLVM compiler that's been floating around 😉
Abstraction
One of the units in the new RTL deals with structures. Not just JSON, but also pure binary structures, just like Delphi or Free Pascal would allocate.
So there is an abstract base class simply called TW3Structure in the unit "System.Structure.pas" (yeah, I know, the names are cunning). It simply looks like this:
EW3Structure = class(EW3Exception);

TW3Structure = class(TObject)
public
  procedure WriteString(Name: string; Value: string; const Encode: boolean);
  procedure WriteInt(const Name: string; Value: integer);
  procedure WriteBool(const Name: string; Value: boolean);
  procedure WriteFloat(const Name: string; Value: float);
  procedure WriteDateTime(const Name: string; Value: TDateTime);

  function ReadString(const Name: string): string;
  function ReadInt(const Name: string): integer;
  function ReadBool(const Name: string): boolean;
  function ReadFloat(const Name: string): float;
  function ReadDateTime(const Name: string): TDateTime;

  function Read(const Name: string): variant; virtual; abstract;
  procedure Write(const Name: string; const Value: variant); virtual; abstract;
  procedure Clear; virtual; abstract;
  procedure SaveToStream(const Stream: TStream); virtual; abstract;
  procedure LoadFromStream(const Stream: TStream); virtual; abstract;
end;
Next you have the JSON implementation in System.Structure.JSON, the XML version in System.Structure.XML, and raw binary in System.Structure.binary.
JSON serialization without baggage
When serializing with JSON we have so far operated either with variants (which you can turn into a JSON object with TVariant.CreateObject) or with records. This is because raw serialization affects everything, including the VMT (virtual method table). So if you serialize a TObject-based instance, it will include the VMT and whatever state it is in, which will cause problems when de-serializing the object back, depending on the complexity of the class.
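The underlying problem is easy to demonstrate in raw JavaScript (a minimal sketch; the class and the $ClassName field are stand-ins for whatever the compiler actually emits): JSON.stringify serializes every enumerable own property, bookkeeping included, while JSON.parse hands back a plain object with no prototype, so the methods are gone.

```javascript
class Point {
  constructor(x, y) { this.x = x; this.y = y; }
  length() { return Math.sqrt(this.x * this.x + this.y * this.y); }
}

const p = new Point(3, 4);
p.$ClassName = "Point";            // stand-in for compiler bookkeeping

const text = JSON.stringify(p);    // '{"x":3,"y":4,"$ClassName":"Point"}'
const q = JSON.parse(text);

console.log(q.x + q.y);            // 7 -- the data survives
console.log(typeof q.length);      // "undefined" -- the method does not
```

This is exactly why round-tripping a full class instance gets messy: the bookkeeping fields ride along in the text, and the behavior has to be re-attached by hand afterwards.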
In the last major update Eric (our compiler wizard) introduced classes that do not root in TObject. As you probably know, TObject has been the absolute base object in Object Pascal for ages. You simply could not create a class that did not inherit from TObject.
Well, that is no longer the case, and Smart Pascal allows you to define classes that are not based on TObject at all. This class form serves an important purpose: such classes can be declared as "external", so you can map them to JavaScript code directly. Say, a library you have included that exposes objects you can create. If you know the structure, just define a class for it, and you can create it like any other object (!)
But what about creating pure JS-based objects via code? Not defined classes or typed structures, but using code to carve out objects from scratch? Imagine being able to define what a record should look like, but at runtime. Sounds interesting? Indeed it does. And without much fuss I give you the TW3JSONObject class.
While somewhat pointless at first sight, it does make writing packets easier for NodeJS services. And extracting info is likewise a snap:
type
  TW3JSONEnumProc = function (Id: string; Data: variant): boolean;

  TW3JSONObject = class
  private
    function GetPropertyCount: integer;
  public
    function ToString: string;
    procedure FromString(const JSONText: string);
    function Equals(const OtherObject: TW3JSONObject): boolean;
    function Compare(const OtherObject: TW3JSONObject): boolean;
    function &Read(const PropertyName: string): variant;
    procedure &Write(const PropertyName: string; const PropertyData: variant); overload;
    function AddOrSet(const PropertyName: string; const PropertyData: variant): TW3JSONObject;
    function QueryPropertyNames: TStrArray;
    procedure Clear;
    function ForEach(const Callback: TW3JSONEnumProc): boolean; overload;
    function &Contains(const PropertyName: string): boolean; overload;
    class function ForEach(const ObjReference: variant; const Callback: TW3JSONEnumProc): boolean; overload;
    class function &Contains(const ObjReference: variant; const PropertyName: string): boolean; overload;
    property Values[const Id: string]: variant read &Read write &Write; default;
    property Count: integer read GetPropertyCount;
    constructor Create; overload; virtual;
    constructor Create(const ObjReference: JObject); overload; virtual;
    constructor Create(const FromPrototype: Variant); overload; virtual;
    destructor Destroy; override;
  end;

{$R 'object.keys.shim.js'}

var __RESERVED: array of string = [
  '$ClassName', '$Parent', '$Init',
  'toString', 'toLocaleString', 'valueOf',
  'indexOf', 'hasOwnProperty', 'isPrototypeOf',
  'propertyIsEnumerable', 'constructor'
];

//############################################################################
// TW3JSONObject
//############################################################################

constructor TW3JSONObject.Create;
begin
  inherited Create;
end;

constructor TW3JSONObject.Create(const ObjReference: JObject);
begin
  Create(Variant(ObjReference));
end;

constructor TW3JSONObject.Create(const FromPrototype: Variant);
begin
  Create;
  (* perform a non-recursive prototype clone of values *)
  if FromPrototype.Valid then
  begin
    ForEach(FromPrototype,
      function (Id: string; Data: variant): boolean
      begin
        if not (Id in __RESERVED) then
          AddOrSet(Id, Data);
        result := true;
      end);
  end;
end;

destructor TW3JSONObject.Destroy;
begin
  Clear;
  inherited;
end;

class function TW3JSONObject.&Contains(const ObjReference: variant;
  const PropertyName: string): boolean;
begin
  result := ObjectPrototype.hasOwnProperty.call(ObjReference, PropertyName);
end;

class function TW3JSONObject.ForEach(const ObjReference: variant;
  const Callback: TW3JSONEnumProc): boolean;
begin
  if assigned(Callback) then
  begin
    if ObjReference.Valid then
    begin
      if ObjReference.IsObject then
      begin
        for var LItem in ObjReference do
        begin
          result := Callback(LItem, ObjReference[LItem]);
          if not result then
            break;
        end;
      end else
        raise Exception.Create('Enumeration failed, reference is not an object error');
    end else
      raise Exception.Create('Enumeration failed, reference is null or unassigned error');
  end;
end;

function TW3JSONObject.ForEach(const Callback: TW3JSONEnumProc): boolean;
begin
  if assigned(Callback) then
  begin
    for var LItem in ThisContext do
    begin
      result := Callback(LItem, ThisContext[LItem]);
      if not result then
        break;
    end;
  end;
end;

procedure TW3JSONObject.Clear;
begin
  var LItems := QueryPropertyNames;
  for var LProp in LItems do
  begin
    asm
      delete this[@LProp];
    end;
  end;
end;

function TW3JSONObject.QueryPropertyNames: TStrArray;
begin
  (* Perform manual enumeration *)
  for var LItem in ThisContext do
  begin
    // Avoid standard items
    if not (LItem in __RESERVED) then
    begin
      if not (Variant(ThisContext[LItem]).Datatype in [vdUnknown, vdFunction]) then
      begin
        if ThisContext.hasOwnProperty.call(ThisContext, LItem) then
          result.add(LItem);
      end;
    end;
  end;
end;

function TW3JSONObject.GetPropertyCount: integer;
begin
  if (ObjectPrototype.keys) then
  begin
    (* Object.keys() is supported by all modern browsers, but equally
       important: NodeJS and IO as well. There are however older runtime
       environments in circulation, so we need fallback mechanisms to
       ensure behavior. *)
    result := ObjectPrototype.keys.call(ObjectPrototype, ThisContext).length;
  end else
  begin
    for var LItem in ThisContext do
    begin
      if ThisContext.hasOwnProperty.call(ThisContext, LItem) then
        inc(result);
    end;
  end;
end;

function TW3JSONObject.&Contains(const PropertyName: string): boolean;
begin
  result := ThisContext.hasOwnProperty.call(ThisContext, PropertyName);
end;

function TW3JSONObject.Read(const PropertyName: string): variant;
begin
  result := ThisContext[PropertyName];
end;

procedure TW3JSONObject.Write(const PropertyName: string; const PropertyData: variant);
begin
  ThisContext[PropertyName] := PropertyData;
end;

function TW3JSONObject.AddOrSet(const PropertyName: string;
  const PropertyData: variant): TW3JSONObject;
begin
  ThisContext[PropertyName] := PropertyData;
  result := self;
end;

function TW3JSONObject.ToString: string;
begin
  result := JSON.Stringify(self);
end;

(* Compare does a property-name and property-value check *)
function TW3JSONObject.Compare(const OtherObject: TW3JSONObject): boolean;
begin
  if assigned(OtherObject) then
  begin
    result := ForEach(
      function (Name: string; Data: variant): boolean
      begin
        result := ObjectPrototype.hasOwnProperty.call(OtherObject, Name)
          and (OtherObject.Read(Name) = Read(Name));
      end);
  end;
end;

(* Equals does a structural compatibility check only *)
function TW3JSONObject.Equals(const OtherObject: TW3JSONObject): boolean;
begin
  if assigned(OtherObject) then
  begin
    result := ForEach(
      function (Name: string; Data: variant): boolean
      begin
        result := ObjectPrototype.hasOwnProperty.call(OtherObject, Name);
      end);
  end;
end;

procedure TW3JSONObject.FromString(const JSONText: string);
var
  LTemp: variant;
  LObj: variant;
begin
  Clear;
  LObj := JSON.Parse(JSONText);
  if (LObj) then
  begin
    for LTemp in LObj do
    begin
      if not (LTemp in __RESERVED) then
        ThisContext[LTemp] := LObj[LTemp];
    end;
  end;
end;
Here is a little example code of how it’s used:
var raw := TW3JSONObject.Create;
raw.Write("hello", true);
raw.Write("values", 1200);
raw.Write("SubElement", TW3JSONObject.Create);
raw.AddOrSet("Another!", TW3JSONObject.Create)
   .AddOrSet("Yet another!", TW3JSONObject.Create);
And extracting property information is likewise easy:
var props := raw.QueryPropertyNames.join(",");
showmessage(props);
3D mathematics [book]
It's not often I promote books, but this time I'll make an exception: Mathematics for 3D Game Programming and Computer Graphics.

A book worth every penny, even if you don't use 3D graphics very often
No matter if you are a Delphi, Smart Pascal, Free Pascal, C# or C++ programmer; sooner or later you are going to have to dip your fingers into what we may call "primal coding". That means coding that was established some time ago, and that has since been isolated and standardized in APIs. This means that if you want to learn it, you are faced with the fact that everyone is teaching you how to use the API, not how to make it or how it works behind the scenes!
3D graphics
Once in a while I go on a retro-computer rant (I know, I know), talking about the good ol' days. But there is a reason for this! And a good one at that. I grew up before things like 3D graphics were commonplace. There was no 3D hardware on the Commodore 64 or the Amiga 500. The 80's and early 90's were purely 2D. So I have been lucky to follow the evolution of these things long before they became "standard" and were isolated away in APIs.
Somewhere around the mid 80's there was a shift away from top-down 2D graphics in games and demo coding: from vanilla squares to isometric tiles (the first game to use them was actually Qbert, released in 1982). So rather than building up a level with 32 x 32 pixel squares, you built up games with tilted, diamond-shaped blocks (or indeed, hexagon-shaped tiles).
This was the beginning of “3D for the masses” as we know it because it added a sense of depth to the game world.

Qbert, 1982, isometric view
With isometric graphics you suddenly had to take this depth factor into account. This became quite important when testing collisions between sprites. And it didn't take long before the classical X, Y, Z formulas became established.
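The classic mapping is simple enough to sketch in a few lines of JavaScript (the tile sizes and the 2:1 ratio here are just the common convention, not taken from any specific game or engine):

```javascript
// Project a grid cell (x, y) with height z onto the screen using the
// classic 2:1 isometric mapping. Tile dimensions are illustrative.
const TILE_W = 32, TILE_H = 16;

function isoToScreen(x, y, z) {
  return {
    sx: (x - y) * (TILE_W / 2),     // grid diagonal becomes screen x
    sy: (x + y) * (TILE_H / 2) - z  // depth and height become screen y
  };
}

console.log(isoToScreen(2, 0, 0));  // { sx: 32, sy: 16 }
console.log(isoToScreen(0, 2, 8));  // { sx: -32, sy: 8 }
```

Once every sprite has a position in this grid space, depth sorting and collision checks fall out naturally from the (x, y, z) coordinates rather than from raw pixels.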
As always, these things already existed (3D computation was common even back in the 70s). But their use in everyday life was extremely rare. Suddenly 3D went from being the domain of architects and scientists to being written and deployed by kids in bedrooms and code-shops. This is where the European demo scene really came to our aid.
Back to school
This book is about the math. And it's explained in such a way that you don't have to be good at it. Rather than teaching you how to use OpenGL or Direct3D, this book teaches you the basics of 3D rotation, vectors, matrices and how it all fits together.
Why is this useful? Because understanding something from the ground up makes you a better programmer. It's like cars: knowing how to drive is the most important thing, but a mechanic will always have a deeper insight into what the vehicle can and cannot do.

Every facet is explained both as theorem and practical example
This is the book you would want if you were to create OpenGL. Or like me, when you don’t really like math but want to brush up on old tricks. We used this in demo coding almost daily when I was 14-15 years old. But I have forgotten so much of it, and the information is quite hard to find in a clear, organized way.
Now, I don't expect anyone to want to write a new 3D engine, but 3D graphics is not just about that. Take something simple, like how an iPhone application transitions between forms. Remember the cube effect? Look at that effect knowing some basic 3D formulas and voila: it's not a big challenge to recreate it in Delphi, C++, C# or whatever language you enjoy the most.
I mean, adding OpenGL or WebGL dependencies just to spin a few cubes or position stuff in 3D space? That’s overkill. It’s actually less than 200 lines of code.
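As a sketch of what those few hundred lines boil down to, here is the core of a software "spinning cube": a rotation around the Y axis followed by a perspective divide (the viewer distance and scale values are arbitrary illustration choices):

```javascript
// Rotate a point around the Y axis, then project it to 2D with a
// simple perspective divide. This is the heart of any software 3D spin.
function rotateY([x, y, z], angle) {
  const c = Math.cos(angle), s = Math.sin(angle);
  return [c * x + s * z, y, -s * x + c * z];
}

function project([x, y, z], dist = 4, scale = 100) {
  const f = scale / (z + dist);  // points further away shrink
  return [x * f, y * f];
}

// A quarter turn moves a point from the x axis onto the z axis.
const p = project(rotateY([1, 0, 0], Math.PI / 2));
console.log(p);  // approximately [0, 0]
```

Run the eight corner points of a cube through these two functions each frame, draw lines between them, and you have the cube effect with no GL dependency at all.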
Well, I'm not going to rant on about this: the book is a keeper!
A bit expensive, but it's one of those books that will never go out of date; the information inside is universal and timeless.
Enjoy!
The case for Raspberry PI, FPGA and AmigaOS 4.1
First, thanks for so many good comments on the whole Amiga retro-emulation concept. I think there is a sort of resurgence today of the whole retro gear thing. On Facebook the Amiga forums and groups are growing, and there is really a sense of community there. Something I haven't experienced with the Amiga for well over a decade (or was it two?).
To those who grew up without an Amiga, we "old timers" must seem nuts. But that is to be expected of generations growing up with 24/7 internet connections. I'm not dissing young programmers in any way, absolutely not; but I will make a case that you are missing out on something very valuable in terms of learning and evolving your skills.
“It’s just that it’s based on pre-existing hardware, not an imaginary instruction-set that
assaults the stack while raping the instruction cache”
The term "personal computer" (PC) doesn't really have any meaning today. I see that with my son as well. He has no personal relationship with his computer at all. The computer is a means to an end for him and his friends, a portal if you like, to the reality on the internet. Be it Steam, Photoshop Express, chatting or whatever. Young coders sort of have a split reality, where online friends they have never met play a bigger role in their lives than, well, their best friend across the street.

Classic Amiga Workbench running under Amibian
People who grew up without the internet had only their computer to deal with. It was the center of music, demos, games and creativity. Be it coding, graphics, sound or whatever the interest was. The result was naturally that you formed bonds with that computer that, to other people, could seem odd or silly. But the phrase "personal computer" is not just a throwback to the time when you no longer needed a campus mainframe or terminal. It also hints at a more personal approach to technology. Which is easy to forget in an age where you switch mobile phones once a year, and the average phone has more computing power than existed on the planet in the 1970's.
Amiga emulation; why it’s a good thing
If we forget the visual aspects of the grey “classical” Amiga OS for a moment and put the looks on the backburner — why on earth should a modern programmer or computing enthusiast even consider Amiga OS? What could a 30-year-old tech bring to a modern world of high-powered CPU and GPU driven monsters?
In a word: efficiency.
AmigaOS thrives with just one megabyte of memory. Stop and think about that for a moment. The core operating system itself resides in a 512kb (half a megabyte) ROM – and the rest fits nicely on a couple of 720kb disks. So if we say that a full desktop experience can fit in 4-5 megabytes (if we include the programs, accessories and extras), what does that tell you?
It should tell you something about how the code is written. But secondly it should tell you about how we write code today (!)
“You think Linux is a fast and efficient operating system? You don’t have a clue”
An average Microsoft Windows installation is what, 16 gigabytes? You can probably trim it down to 8 gigabytes by removing services, graphics and drivers you don't use. There is also a huge difference in the size of executables and the amount of information stored in the binaries, but ultimately it comes down to a shift in mindset that occurred back in the late 90's: rather than forcing coders to write efficient programs, the emphasis was placed on the hardware to deliver enough power to run crap and bloated code.
Now being a programmer myself I have no illusions that if AmigaOS, this time the modern and latest 4.x version, was ever re-compiled for x86 it would naturally result in bigger binaries. Depending on the amount of drivers, you would probably end up with at least 512 megabytes to 1 gigabyte of software. Add presentation and media to that and we are quickly breaching the 1.5 to 2 gigabyte boundary. But that in itself would be revolutionary compared to the size of Ubuntu or Windows. Yet the core of the operating system is so small that many young developers find it hard to believe.
And yes, I know the Linux kernel and base packages can be squeezed down. But in all honesty, the Amiga is a much more interesting system. Some of the stuff you can do with shell scripting and ARexx on the Amiga, the lack of cryptic complexity, the ease of use and the control you as an end-user have; I'm sorry, but Linux is anything but user-friendly.
Why Raspberry PI
By any modern measure, the Raspberry PI is an embedded board at best, and a toy at worst. It exists there between the cusps of single-function boards and a modern computer. But is it really that bad? Actually, it's bloody smashing. It's just that people haven't really been able to run anything written specifically for it yet.
Amibian, a Debian-based distro that boots straight into UAE (the Unix Amiga Emulator) and runs the classic 16/32-bit Amiga OS, presently performs at 3.2 times the speed of an Amiga 4000/060. So for $35 you will own the most powerful Amiga ever devised. If you take it one step further and overclock the PI (and add a heat-sink so you don't burn out the SoC), it emulates the Amiga operating system at roughly 4 times the speed of the flagship high-end Amiga of the late 90's and early 2000's. You also get 32-bit graphics, HDMI output, USB device access through the Linux sub-layer, built-in tcp/ip (and WiFi built-in on the model 3b). And naturally: a hell of a lot more ram than the Amiga even needs (!)
Now remember, this is emulated at the 68k instruction level (!) It is practically the same as running Java or CLR bytecodes (!) Which is a good parallel. People ask me why I bother with 68k; my reply is: why the hell do you bother with Java bytecodes if you don't have a clue what a virtual machine is! An emulator is a virtual machine in the true sense of the phrase. It's just that it's based on pre-existing hardware, not an imaginary instruction-set that assaults the stack while raping the instruction cache (yeah, I'm looking at you, Java!).
Imagine then for a second what the situation would be if Amiga OS was compiled for Arm, running natively on the Raspberry PI with direct access to everything. You think Linux is a fast and efficient operating system? You don’t have a clue.
I mean, the PI was initially created to deliver cheap computing power to schools and educational centers, not to mention third-world countries. It made big waves as it blew the ridiculous "$100 one PC per child" campaign out of the water (which was more an insult to the poor of Africa than anything resembling help). Yet at the end of the day, what do these third-world countries have to work with? Raspbian and Ubuntu are usable, but only superficially.
Try compiling something on the PI with a modern compiler. What would take less than a second to compile under Amiga OS can take up to 45 minutes to build under Linux on that SoC. If a kid in Africa starts learning C++ with Linux on a PI, he will be 59 years old before he can even apply for a job.
If AmigaOS 4 was ever compiled and set up to match the SoC firmware (which is another benefit of the PI: the hardware is fixed, and very few drivers would have to be made), it would revolutionize computing from the ground up. And I think people would be amazed at just how fast programs can be when written to be efficient, rather than the onslaught of bloat coming out of Redmond (not to mention Ubuntu, which is becoming more and more useless).
The benefits for Hyperion Entertainment, which has done a phenomenal job of upgrading AmigaOS to the modern age, are many:
- Increased sales of the operating system
- Sale of merchandize surrounding the AmigaOS brand
- Sale of SDK and associated development tools
- The establishment of a codebase for OS 4 that is modern
If we take it one step further and look at what would be the next natural step:
- Team up with case producers to deliver a more “normal size” case for the PI with keyboard
- Team up with Cloanto to ship the old ROM files for the built-in 68k emulation layer
The point of all this? To build up money. Enough money for Amiga Inc, Commodore or Hyperion to buy time. Time enough for the codebase to grow and become relevant in the marketplace. Once established, to further the sale of a dedicated Amiga HW platform (preferably ARM or x86) and secure the investment the owners have made over the years.
FPGA, the beast of xmas future
FPGA (field programmable gate array) is the future. I don’t care how proud you are of your Intel i7 processor (I have a couple of those myself). Mark my words: 20 years from now you will be blazing behind your FPGA based computer. And I have no doubt that games and applications will adapt the hardware to their needs – with possibilities we can’t even dream about today; let alone define.

Mist. A low-cost FPGA computer capable of becoming an Amiga (and about 10 other platforms). The closest thing to new Amiga hardware to be created in 20 years.
Today's processors are fixed. They have a fixed architecture that is written in silicon and copper. Once cooked, they cannot be altered in any way. Nanotube technology is just about to drop, but again, the nature of fixed systems is that they cannot be altered once cooked.
FPGA however is based on gate logic. Which means (simply put) that the relations that make up the internal architecture are fluid, like a million doors that can be opened or closed to create all manner of living space. In many ways it's like a virus, capable of absorbing existing blueprints and becoming "that blueprint". If we dip into sci-fi for a moment, this is the computer variation of a xenomorph, a shape-shifter. A creature that can adapt and alter itself to become any other thing.
As of writing, this tech is in its infancy. It's just getting out there, and the prices and speed of these spectacular devices bear witness to its age and cost of production. If you want an FPGA with some kick in it, you had better be prepared to take out a second mortgage on your house.

The Vampire 2 accelerator for Amiga 600. This takes over and gives the Amiga so much power that it can play movies, modern music and 3d games faster than ever imagined. At the same time! In fact, I bought an A600 just to get this!
One of the cool things about this hardware is how it's being used today. One of the first hardware platforms to be devised for FPGA was (drumroll) the Amiga. And you have to understand that we are not talking just the 68k CPU here, but the whole bloody thing: Paula, Agnus, Denise, Fat Agnus and the whole crew of chips that made the Amiga so popular in the first place. All of it coded in gate-logic and uploaded to a chip that with a flick of a switch can turn right around and become an x86 PC, a PPC Mac, a Commodore 64, a Nintendo 64 or whatever tickles your fancy.
Let's stop and think about this.
Today we use virtual machines to mimic or translate bytecode (or pre-existing CPU instructions). We call these solutions by many names: virtual machine, emulator, runtime; but it's all the same really. Even if you slap a JIT (just-in-time compiler) into the mix, which is the case with both emulators and the Java and .NET runtimes, it still boils down to running an imaginary (or pre-defined) platform under the umbrella of a fixed piece of kit.
Now what do you think would be the next logical step in that evolution?
The answer is naturally being able to sculpt virtual machines in hardware (read: fixed hardware that gives you a non-fixed field). Making fixed processors is a costly process. Yet primitive when we really look at it. We may have shrunk the brilliance of the 1800's and early 1900's and made all the great inventions of the past fit on the head of a pin, but it's all based on the same stuff: physical implementation. Someone has to sit there with a microscope and hammer the damn circuits out (although "hammer" is maybe the wrong word at the particle level).
This is also the problem with western culture: the use-and-throw-away mentality that creates mountains of technological garbage, and powers child labour and sickness beyond belief in poor parts of the world. You have six-year-old kids who work with melting out copper and mercury. A waste of life, tech and nature. So yeah, there is a bit of morality in this as well.
FPGA represents, really, the first actual breakthrough and paradigm shift since the invention of the first electric circuit. For the first time in history a medium has been created that is not fixed. It has to be created of course, and it's not like it's wet-wire technology or anything, but for the first time anyone with the skill to code the gates can shape and adapt the hardware without the need to cook the chips first.
And they can be infinitely re-used, which is good for both people and nature.
Think about it.. then go “holy cow”.
And that my friend – is the thought of the day!
Amiga Reloaded, can I preorder?
Without much fanfare, some brilliant news made it into the retro-computing scene yesterday, namely that our German superheroes over at Individual Computers GmbH have acquired the rights to the Commodore name.

A brand new C64 motherboard, still going after all that time
“The Amiga for me represent a whole timeline of computing history that was aborted, a timeline which, had it been allowed to continue, would have given the world a much better experience of computing”
Individual has been shipping their Commodore 64 replacement motherboard for some time, which apparently is a very popular product for people into the C64 scene. I would love to get my hands on it, but while I grew up on a C64, my computing life really began with the Commodore Amiga.
It just wont die
The Amiga home computer is a paradox wrapped in an enigma. It's been out of production since the 90's, parts cost more than a used car, yet thousands of people around the world use this (by today's standards) ancient computer platform daily.
So what is it about this computer that simply refuses to die? Why do people, young and old, love this 30 year old computer?
I can only speak for myself, but I think it has to do with the fact that the Amiga was murdered. That is how I feel, anyway. It was in the prime of its life when it was killed and replaced by backwater, poorly made computers that didn't deserve to win. So I think maybe, if I'm honest, it's a classic case of martyrdom.
The Amiga for me represents a whole timeline of computing history that was aborted; a timeline which, had it been allowed to continue, would have given the world a much better experience of computing. Not to mention our capabilities as a race with regard to data processing in all avenues of life.

My Raspberry PI 3 Amiga is just fantastic!
I have never seen PC users get into a physical fist-fight over their Pentiums, or AMD users bashing Intel users in the head; but I have seen Amiga users go head to head at copy parties, beating the living daylights out of each other. You can't buy that level of dedication, it has to be earned. I don't think any other computer enjoys a mass of users who actually love, in the true sense of the word, every inch of their platform.
But the Amiga does.
And those who grew up on the Amiga won't rest until it's resurrected, which incidentally can now actually become a reality.
Amiga Reloaded
I sent an email to Individual asking them about the Amiga 1200 and whether it was part of their plan. I mean, they have now finally wrested the rights to the Commodore name from the hands of vultures.
I actually got chills when I read their reply:
“The A1200 is also on the agenda, yes“
I was supposed to get into bed before midnight, but by the time the mental storm had passed I found myself messing around in UAE at 3 o’clock in the morning!
What goodies could a dedicated hardware shop like Individual Computers introduce in a new Amiga? In my mind the ultimate reloaded Amiga would be something like this:
- FPGA running the show
- Stuff AGA modes into the FPGA core, pure chunky out!
- A solid 512 megabyte of memory would be nice
- HDMI out
- USB for mouse
- Sata port
- WiFi on chip
The above list is just my hopes for what an A1200 Reloaded could look like. But to be perfectly honest, I would be happy just being able to buy a slightly pumped-up A1200 at a reasonable price. If nothing else, than to stick it to the morons on eBay charging $6000 for an Amiga 1000 (it's gotten way past ridiculous).
Updated: The specs
Stefan Egger pointed out that a draft of the specs is online, and sadly (if this is the working draft) it seems poor compared to my hopes. But the article does start with "The following is a preliminary specification. Things may change". Head over to
http://wiki.icomp.de/wiki/Amiga_reloaded
..to read the specs.
But it's still good news, I think. If nothing more, than to at least break the monopoly going on at eBay. But for me personally, if this is what they are going for, I will probably have to order a Vampire 2 accelerator before I even want to get near it.
I was seriously hoping for a "minimig" FPGA in the 120-200 MHz range. Just drop the custom chips and dump them into the FPGA core, which would make for a very resilient computer limited only by gate-speed and overall performance.
Oh well, at least the spare parts problem is about to solve itself 🙂