Posts Tagged ‘C#’

Nodebuilder, QTX and the release of my brand new social platform

January 2, 2020

First, let me wish everyone a wonderful new year! With xmas and the silly season firmly behind us, and my batteries recharged – I feel my coding fingers itch to get started again.

2019 was a very busy year for me. Exhausting even. I have juggled a full-time job, kids and family, as well as our community project, Quartex Media Desktop. And since that project has grown considerably, I had to stop and work on the tooling. Which is what NodeBuilder is all about.

I have also released my own social media platform (see further down). This was initially scheduled for Q4 2020, but Facebook pissed me off something fierce, so I set it up in December instead.

NodeBuilder

Those of you who read my blog probably remember the message system I made for the Quartex Desktop, called Ragnarok. This is a system for message dispatching and RPC, except that the handling is decoupled from the transport medium. In other words, it doesn't care how you deliver a message (WebSocket, UDP, REST), but rather takes care of serialization, binary data and security.

All the back-end services that make up the desktop system are constructed around Ragnarok. So each service exposes a set of methods that the core can call, much like a normal RPC / SOAP service would. Each method is represented by a request and response object, which Ragnarok serializes to a JSON message envelope.

In our current model I use WebSocket, which is a full duplex, long-term connection (to avoid the overhead of having to connect and perform a handshake for each call). But there is nothing stopping us from implementing a REST transport layer. UDP is already supported; it is used by the zero-config system, where the services automatically find each other and register as long as they are connected to the same router or switch. For the public service I think REST makes more sense, since it will better utilize the software clustering that node.js offers.
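To make the idea concrete, here is a minimal sketch of what such a request/response pair could look like in Object Pascal. The class names (TQTXMessageEnvelope, TQTXLoginRequest and so on) are hypothetical illustrations for this post, not the actual Ragnarok API:

type
  // TQTXMessageEnvelope stands in for the Ragnarok base class that
  // handles JSON serialization; the name is made up for this example.
  TQTXLoginRequest = class(TQTXMessageEnvelope)
  private
    FUsername: string;
    FPassword: string;
  published
    property Username: string read FUsername write FUsername;
    property Password: string read FPassword write FPassword;
  end;

  TQTXLoginResponse = class(TQTXMessageEnvelope)
  private
    FSessionId: string;
  published
    property SessionId: string read FSessionId write FSessionId;
  end;

Since the transport layer only ever sees the serialized envelope, the same pair works unchanged whether it travels over WebSocket, UDP or REST.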

nodebuilder

Node Builder is a relatively simple service designer, but highly effective for our needs

 

Now for small services that expose just a handful of methods (like our chat service), writing the message classes manually is not really a problem. But the moment you start having 20 or 30 methods – and need to implement up to 60 message classes manually – this approach quickly becomes unmanageable and prone to errors. So I simply had to stop before xmas and work on our service designer. That way we can generate the boilerplate code in seconds rather than days and weeks.

While I don't have time to evolve this software beyond that of a simple service designer (well, I kinda did already), I have no problem seeing this system as the beginning of a wonderful, universal service authoring system. One that includes coding, libraries and the full scope of the QTX runtime-library.

In fact, most of the needed parts are in the codebase already, but not everything has been activated. I don’t have time to build both a native development system AND the development system for the desktop.

NodeBuilder already has a fully functional form designer and code editor, but it is dormant for now due to time restrictions. Quartex Media Desktop comes first.

But right now, we have bigger fish to fry.

Quartex Media Desktop

We have made tremendous progress on our universal desktop environment, to the point where the baseline services are very close to completion. A month should be enough to finish this unless something unforeseen comes up.

Quartex Media Desktop provides an ecosystem for advanced web applications

You have to factor in that this project has only had weekends and the odd after-work hours allocated to it. So even though we have been developing this for 12 months, the actual number of days is roughly half of that.

So all things considered I think we have produced a massive amount of code in a short time. Even simple 2D games usually take 2 years of daily development, and that includes a team of at least 5 people! I'm a single developer working in my spare time.

So what exactly is left?

The last thing we did before Xmas was to throw out the last remnants of Smart Mobile Studio code. The back-end services are now completely implemented in our own QTX runtime-library, which has been written from scratch. There is not a line of code from Smart Mobile Studio in QTX, which means we no longer have to care what that system does or where it goes.

To sum up:

  • Push all file handling code out of the core
  • Implement file-handling as its own service

Those two steps might seem simple enough, but you have to remember that the older code was based on the Linux path system, and was read-only.

So when pushing that code out of the core, we also have to add all the functionality that was never implemented in our prototype.

Each class actually represents a separate “mini” program, and there are still many more methods to go before we can put this service into production.

Since JavaScript does not support threads, each method needs to be implemented as a separate program. So when a method is called, the file/task manager literally spawns a new process just for that task. The result is then swiftly returned to the caller in an async manner.

So what is ultimately simple becomes more elaborate if you want to do it right. This is the price we pay for universality and a cluster-enabled service-stack.

This is also why I have put the service development on pause until we have finished the NodeBuilder tooling. I did this because I know from experience that the moment the baseline is ready, both myself and users of the system are going to go “oh we need this, and that and those”. Being able to quickly design and auto-generate all the boilerplate code will save us months of work. So I would rather spend a couple of weeks on NodeBuilder than waste months manually writing all that boilerplate code down the line.

What about the QTX runtime-library?

Writing an RTL from scratch was not something I could have anticipated before we started this project. But thankfully the worst part of this job is already finished.

The RTL is divided into three parts:

  • Non Visual code. Classes and methods that make QTX an RTL
  • Visual code. Custom Controls + standard controls (buttons, lists etc)
  • Visual designer

As you can see, the non-visual aspect of the system is finished and working beautifully. It's a lot faster than the code I wrote for Smart Mobile Studio (roughly twice as fast on average). I have also implemented a full visual designer, both as a Delphi visual component and as a QTX visual component.

Quartex Media Desktop makes running on several machines [cluster] easy and seamless

So fundamental visual classes like TCustomControl are already there. What I haven't had time to finish are the standard controls, like TButton, TListBox, TEdit and those types of visual components. They will be added after the release of QTX, at which point we throw out the absolute last remnants of Smart Mobile Studio from the client (HTML5) software too.

Why is the QTX Runtime-Library important again?

When the desktop is out the door, the true work begins! The desktop has several roles to play, but its most important aspect is to provide an ecosystem capable of hosting web based applications, offering features and methods traditionally only found in Windows, Linux or OS X. It truly is a complete cloud system that can scale from a single affordable SBC (single board computer) to a high-end cluster of powerful servers.

Clustering and writing distributed applications has always been difficult, but Quartex Media Desktop makes it simple. It is no more difficult for a user to work on a clustered system than it is to work on a normal, single OS. The difficult part has already been taken care of, and as long as people follow the rules, there will be no issues beyond ordinary maintenance.

And the first commercial application to come out of Quartex Components is Cloud Forge, which is the development system for the platform. It has the same role as Visual Studio for Windows, or Xcode for Apple OS X.

The Quartex Media Desktop Cluster cube. A $400 super computer

I have prepared 3 compilers for the system already. First there is C/C++ courtesy of Clang, so C developers will be able to jump in and get productive immediately. The second compiler is Freepascal, or more precisely pas2js, which allows you to compile ordinary Freepascal code (which is highly Delphi compatible) to both JavaScript and WebAssembly.

And last but not least, there is my fork of DWScript, which is the same compiler that Smart Mobile Studio uses. Except that my fork is based on the absolute latest version, and I have modified it heavily to better match special features in QTX. So right out of the door CloudForge will have C/C++, two Object Pascal compilers, and vanilla JavaScript and TypeScript. TypeScript also has its own WebAssembly compiler, so doing hard-core development directly in a browser or HTML5 viewport is where we are headed.

Once the IDE is finished I can finally, finally continue on the LDEF bytecode runtime, which will be used in my BlitzBasic port and ultimately replace Clang, Freepascal and DWScript. As a bonus it will emit native code for a variety of systems, including x86, ARM, 68k [including 68080] and PPC.

This might sound incredibly ambitious, if not impossible. But what I'm ultimately doing here is moving existing code that I already have into a new paradigm.

The beauty of Object Pascal is the sheer size and volume of available components and code. Some refactoring must be done due to the async nature of JS, but when needed we fall back on WebAssembly via Freepascal (WASM executes linearly, just like ordinary native code does).

A brand new social platform

During December Facebook royally pissed me off. I cannot underline enough how much I loathe A.I. censorship, and the mistakes that A.I. makes – mistakes you are utterly powerless to complain about or have heard by a human being. In my case I posted a gif from their own mobile application, of a female body builder doing push-ups while in a hand-stand. In other words, a completely harmless gif with strength as the punchline. The A.I. was not able to distinguish between a leotard and bare skin, and just like that I was muted for over a week. No human being would make such a ruling. As an admin of a fairly large set of groups, there are many cases where bans are the result of disgruntled members acting out of revenge and reporting technical posts about coding as porn or offensive. Again, you are helpless because there is nobody you can talk to about resolving the issue. And this time I had enough.

It was always planned that we would launch our own social media platform, an alternative to Facebook aimed at adult geeks rather than kids (Facebook operates with an age limit of 12 years). So instead of waiting I rushed out and set up a brand new social network. One where those banal restrictions Facebook has conditioned us with do not apply.

Just to underline, this is not some simple and small web forum. This is more or less a carbon copy of Facebook the way it used to be 8-9 years ago. So instead of having a single group on Facebook, we can now have as many groups as we like, on a platform that looks more or less identical to Facebook – but under our control and human rules.

Amigadisrupt.com is a brand new social media platform for geeks

You can visit the site right now at https://www.amigadisrupt.com. Obviously the content on the platform right now is dominated by retro computing – but groups like Delphi Developer and FPC Developer have already been set up and are in use. If you are expecting thousands of active users, that will take time. We are now closing in on 250 active users, which is pretty good for such a short period of time. I don't want a platform anywhere near as big as FB. The goal is to get 10k users and have a stable community of coders, retro geeks, builders and creative individuals.

AD (Amiga Disrupt) will be a standard application that comes with Quartex Media Desktop. This is the beauty of web technology, in that it can unify different resources under one roof. And we will have our cake and eat it come hell or high water.

Disclaimer: Amiga Disrupt has a lower age limit of 18 years. This is a platform meant for adults. Which means there will be profanity, jokes that would get you banned on Facebook, and content that is not meant for kids. This is hacker-land, and political correctness is considered toilet paper. So if you need the social toffery FB and Twitter deal in, you will be kicked by one of the admins.

After you sign up your feed will be completely empty. Here is how to get it started. And feel free to add me to your friends-list!

Hydra, what’s the big deal anyway?

October 29, 2019

RemObjects Hydra is a product I have used for years in concert with Delphi, and like most developers that come into contact with RemObjects products – once the full scope of the components hits you, you never want to go back to not using Hydra in your applications.

Note: It’s easy to dismiss Hydra as a “Delphi product”, but Hydra for .Net and Java does the exact same thing, namely letting you mix and match modules from different languages in your programs. So if you are a C# developer looking for ways to incorporate Java, Delphi, Elements or Freepascal components in your application, then keep reading.

But let’s start with what Hydra can do for Delphi developers.

What is Hydra anyways?

Hydra is a component package for Delphi, Freepascal, .Net and Java that takes plugins to a whole new level. Now bear with me for a second, because these plugins are in a completely different league from anything you have used in the past.

In short, Hydra allows you to wrap code and components from other languages, and use them from Delphi or Lazarus. There are thousands of really amazing components for the .Net and Java platforms, and Hydra allows you to compile those into modules (or “plugins” if you prefer that); modules that can then be used in your applications as if they were native components.

Hydra, here using a C# component in a Delphi application

But it doesn’t stop there; you can also mix VCL and FMX modules in the same application. This is extremely powerful since it offers a clear path to modernizing your codebase gradually, rather than doing a time-consuming and costly re-write.

So if you want to move your aging VCL codebase to Firemonkey, but the cost of having to re-write all your forms and business logic for FMX would break your budget – that’s where Hydra gives you a second option: you can continue to use your VCL code from FMX and refactor the application at your own tempo and with minimal financial impact.
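To give a rough idea of what hosting a module looks like, here is a minimal sketch of a host form loading a plugin at runtime. It assumes a THYModuleManager component dropped on the form, a panel called pnlHost, and a visual plugin named 'MyFMXForm'; the method and interface names are written from memory and may not match the current Hydra API exactly, so treat it as illustrative rather than reference code:

procedure TMainForm.LoadPluginButtonClick(Sender: TObject);
var
  Instance: IHYVCLVisualPlugin; // interface name from memory – check the Hydra docs for your version
begin
  // Load the Hydra module (*.dll) sitting next to the host executable
  HYModuleManager1.LoadModule(ExtractFilePath(ParamStr(0)) + 'MyPlugins.dll');

  // Create the visual plugin by name and dock it into a panel on this form
  HYModuleManager1.CreateVisualPlugin('MyFMXForm', Instance, pnlHost);
end;

The point is that the host neither knows nor cares whether the module behind 'MyFMXForm' was built with FMX, .Net or Java – it just shows up inside the panel.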

The best of all worlds

Not long ago RemObjects added support for Lazarus (Freepascal) to the mix, which once again opens a whole new ecosystem that Delphi, C# and Java developers can benefit from. Delphi has a lot of really cool components, but Lazarus has components that are not always available for Delphi. There are some really good developers in the Freepascal community, and you will find hundreds of components and classes (if not thousands) that are open-source. For example, Lazarus has a branch of SynEdit that is much more evolved and polished than the fork available for Delphi. And with Hydra you can compile that into a module / plugin and use it in your Delphi applications.

This is also true for Java and C# developers. Some of the components available for native languages might not have similar functionality in the .Net world, and by using Hydra you can tap into the wealth that native languages have to offer.

As a Delphi or Freepascal developer, perhaps you have seen some of the fancy grids C# and Java coders enjoy? Developer Express has some of the coolest components available for any platform, but their focus is more on .Net these days than Delphi. They do maintain the control packages they have, but compared to the amount of development done for C#, their Delphi offerings are abysmal. So with Hydra you can tap into the .Net side of things and use the latest components and libraries in your Delphi applications.

Financial savings

One of the coolest features of Hydra is that you can use it across Delphi versions. This has helped me soften the price-tag of updating to the latest Delphi.

It’s easy to forget that whenever you update Delphi, you also need to update the components you have bought. This was one of the reasons I was reluctant to upgrade my Delphi license until Embarcadero released Delphi 10.2. Because I had thousands of dollars invested in components – and updating all my licenses would cost a small fortune.

So to get around this, I put the components into a Hydra module and compiled that using my older Delphi. And then I simply used those modules from my new Delphi installation. This way I was able to cut costs by thousands of dollars and enjoy the latest Delphi.

Using Firemonkey controls under VCL is easy with Hydra

A couple of years back I also took the time to wrap a ton of older components that work fine but are no longer maintained or sold. I used an older version of Delphi to get these components into a Hydra module – and I can now use those with Delphi 10.3 (!). In my case there was a component-set for working closely with Active Directory that I have used in a customer’s project (much faster than having to go the route via SQL). The company that made these doesn’t exist any more, and I have no source-code for the components.

The only way I could have used these without Hydra would be to compile them into a .dll file and painstakingly export every single method (or use COM+ to cross the 32-bit / 64-bit barrier), which would have taken me a week since we are talking about a large body of quality code. With Hydra I was able to wrap the whole thing in less than an hour.

I’m not advocating that people stop updating their components. But I am very thankful for the opportunity to delay having to update my entire component stack just to enjoy a modern version of Delphi.

Hydra gives me that opportunity, which means I can upgrade when my wallet allows it.

Building better applications

There is also another side to Hydra, namely that it allows you to design applications in a modular way. If you have the luxury of starting a brand new project and using Hydra from day one, you can isolate each part of your application as a module, avoiding the trap of monolithic applications.

Hydra for .Net allows you to use Delphi, Java and FPC modules under C#

This way of working has great impact on how you maintain your software, and consequently how you issue hotfixes and updates. If you have isolated each key part of your application as separate modules, you don’t need to ship a full build every time.

This also safeguards you from having all your eggs in one basket. If you have isolated each form (for example) as a separate module, there is nothing stopping you from rewriting some of these forms in another language – or crossing the VCL and FMX barrier. You have to admit that being able to use the latest components from Developer Express is pretty cool. There is not a shadow of a doubt that Developer Express makes the best damn components around for any platform. There are many grids for Delphi, but they can’t hold a candle to the latest and greatest from Developer Express.

Why can’t I just use packages?

If you are thinking “hey, this sounds exactly like packages, why should I buy Hydra when packages do the exact same thing?“ – well, that’s actually not how packages work in Delphi.

Delphi packages are cool, but they are also severely limited. One of the reasons you have to update your components whenever you buy a newer version of Delphi is that packages are not backwards compatible.

Delphi packages are great, but severely limited

A Delphi package must be compiled with the same RTL as the host (your program), and version information and RTTI must match. This is because packages use the same RTL and more importantly, the same memory manager.

Hydra modules are not packages. They are clean and lean library files (*.dll files) that include whatever RTL you compiled them with. In other words, you can safely load a Hydra module compiled with Delphi 7 into a Delphi 10.3 application without having to re-compile.

Once you start to work with Hydra, you gradually build up modules of functionality that you can recycle in the future. In many ways Hydra is a whole new take on components and RAD. This is how Delphi packages and libraries should have been.

Without saying anything bad about Delphi, because Delphi is a system that I love very much; but having to update your entire component stack just to use the latest Delphi is sadly one of the factors that have led developers to abandon the platform. If you have USD 10,000 in dependencies, having to pay that as well as buying Delphi can be difficult to justify; especially when comparing with other languages and ecosystems.

For me, Hydra has been a tremendous boon for Delphi. It has allowed me to keep current with Delphi and all its many new features, without losing the money I have already invested in component packages.

If you are looking for something to bring your product to the next level, then I urge you to spend a few hours with Hydra. The documentation is exceptional, the features and benefits are outstanding — and you will wonder how you ever managed to work without them.

External resources

Disclaimer: I am not a salesman by any stretch of the imagination. I realize that promoting a product made by the company you work for might come across as a sales pitch; but that’s just it: I started to work for RemObjects for a reason. And that reason is that I have used their products since they came on the market. I have worked with these components long before I started working at RemObjects.

.NetRocks, you made my day!

October 11, 2019

A popular website for .Net developers is called dot-net-rocks. This is an interesting site that has been going for a while now; well worth the visit if you do work with the .Net framework via RemObjects Elements, VS or Mono.

Now it turns out that the guys over at dot-net-rocks just did an episode on their podcast where they open by labeling me as a “raving lunatic” (I clearly have my moments), which I find absolutely hilarious, but not for the same reasons as them.

Long story short: They are doing a podcast on how to migrate legacy Delphi applications to C#, and in that context they somehow tracked down an article I posted way back in 2016, which was meant as a satire piece. Now don’t get me wrong, there are serious points in the article, like how the .Net framework was modeled on the Delphi VCL, and how the concepts around CLR and JIT were researched at Borland; but the tone of the whole thing, the “larger than life” claims etc. was meant to demonstrate just how some .Net developers behave when faced with alternative eco-systems. Having managed some 16+ usergroups for Delphi, C#, JavaScript (a total of six languages) on Facebook for close to 15 years, as well as working for Embarcadero that makes Delphi – I speak from experience.

It might be news to these guys that large companies around Europe are still using Delphi, modern Delphi, and that Object Pascal as a language scores well on the TIOBE index of popular programming languages. And no amount of echo-chamber mentality is going to change that fact. Heck, as late as 2018, The Walt Disney Company wanted to replace C# with Delphi, because it turns out that bytecodes and embedded tech are not the best combination (CPU spikes when the GC kicks in, no real-time interrupt handling possible, GPIO delays, the list goes on).

I mean, the post I made back in 2016 is such obvious, low-hanging fruit for a show their size to pound on. You have this massive show that takes on a single, albeit ranting (and probably a bit of a lunatic if I don’t get my coffee) coder’s post, underlining in the process how little they know about the Object Pascal community at large. They just demonstrated my point in bold, italic and underline 😀

Look before you shoot

DotNetRocks is either oblivious to the fact that Delphi still has millions of users around the world, or that Pascal is in fact available for .Net (which is a bit worrying since .Net is supposed to be their game). The alternative is that the facts I listed hit a little too close to home. I’ll leave it up to the reader to decide. Microsoft has lost at least 10 universities around Europe to Delphi in 2018 that I know of, two of them Norwegian where I was personally involved in the license sales. While it is only speculation, I do find the timing of their podcast and the focus on me in particular to be “curious”.

And for the record, the most obvious solution when faced with “that legacy Delphi project” is to just go and buy a modern version of Delphi. DotNetRocks delivered a perfect example of the very arrogance my 2016 post was designed to convey; namely that “brogrammers” often act like Delphi 7 was the last Delphi. They also resorted to lies to sell their points: I never said that Anders was dogged for creating Delphi. Quite the opposite. I simply underlined that by ridiculing Delphi with one hand, and praising its author with the other – you are indirectly (and paradoxically) invalidating half his career. Anders is an awesome developer, but why exclude how he evolved his skills? Of course Anders’ products will have his architectural signature on them.

Not once did they mention Embarcadero or the fact that Delphi has been aggressively developed since Borland kicked the bucket. Probably hoping that undermining the messenger will somehow invalidate the message.

Porting Delphi to C# manually? Ok.. why not install Elements and just compile it into an assembly? You don’t even have to leave Visual Studio

Also, such an odd podcast for professional developers to run with. I mean, who the hell converts a Delphi project to C# manually? It’s like listening to a graphics artist who doesn’t know that Photoshop and Illustrator are the de-facto tools to use. How is that even possible? A website dedicated to .Net, yet with no insight into the languages that run on the CLR? Wow.

If you want to port something from Delphi to .Net, you don’t sit down and manually convert stuff. You use proper tools like Elements from RemObjects; this gives you Object Pascal for .Net (so a lot of code will compile just fine with only minor changes). Elements also ships with source-conversion tools, so once you have it running under Oxygene (the Object Pascal dialect in Elements) you either just use the assemblies – or convert the Pascal code to C# through a tool called the Oxidizer.

The most obvious solution is to just upgrade to a Delphi version from this century

The other solution is to use Hydra, also a RemObjects product. You can then compile the Delphi code into a library (including visual parts like forms and frames), and simply use that like any other assembly from within C#. This allows you to gradually phase out older parts without breaking the product. You can also use C# assemblies from Delphi with Hydra.

So by all means, call me what you like. You have only proved my point so far. You clearly have zero insight into the predominant Object Pascal eco-systems, you clearly don’t know the tools developers use to interop between archetypical and contextual languages – and instead of fact checking some of the points I made, dry humor notwithstanding, you just reacted like brogrammers do.

Well, it’s been weeks since I laughed this hard 😀 You really need to check before you pick someone to verbally abuse on the first date, because you might just bite yourself in the arse here, he he.

Cheers

 

RemObjects Elements + ODroid N2 = true

August 7, 2019

Since the release of the Raspberry Pi back in 2012, the IOT and embedded market has exploded. The price of the Pi SBC (single board computer) enabled ordinary people without any engineering background to create their own software and hardware projects; and with that the IOT revolution was born.

Almost immediately after the Pi became a success, other vendors wanted a piece of the pie (pun intended), and an avalanche of alternative mini computers started surfacing. Yet very few of these so-called “Pi killers” actually stood a chance. The power of the Raspberry Pi is not just price, it’s the ecosystem around the product. All those shops selling electronic parts that you can use in your IOT projects, for example.

The ODroid N2, one of the fastest SBCs in its class

The ODroid family of single-board computers stands out as unique in this respect. Where other boards have come and gone, the ODroid family of boards has remained stable, popular and an excellent alternative to the Raspberry Pi. Hardkernel, the maker of ODroid boards and their many peripherals, is not looking for a “quick buck” like others are. Instead they have slowly and steadily perfected their hardware and software, and seeded a great community.

ODroid is very popular at RemObjects, and when we added 64-bit ARM Linux support a couple of weeks back, it was the ODroid N2 board we used for testing. It has been a smooth ride all the way.

ODroid

As I am typing this, a collection of ODroid XU4s is humming away inside a small desktop cluster I have built. This cluster is made up of 5 x ODroid XU4 boards, with an additional ODroid N2 acting as the head (the board that controls the rest via the network).

My ODroid Cluster in all its glory

Prior to picking ODroid for my own projects, I took the time to test the most popular boards on the market. I think I went through eight or ten models, but none of the others were even close to the quality of ODroid. It’s very easy to confuse aggressive marketing with quality. You can have the coolest hardware in the world, but if it lacks proper drivers and a solid Linux distribution, it’s for all intents and purposes a waste of time.

Since IOT is something that I find exciting on a personal level, being able to target 64-bit ARM Linux has topped my wish-list for quite some time. So when our compiler wizard Carlo Kok wanted to implement support for 64-bit ARM Linux, I was thrilled!

We used the ODroid N2 throughout the testing phase, and the whole process was very smooth. It took Carlo roughly 3 days to add support for 64-bit ARM Linux and it hit our main channel within a week.

I must stress that while the ODroid N2 is one of our verified SBCs, the code is not explicitly about ODroid. You can target any 64-bit ARM SBC provided you use a Debian-based Linux (Ubuntu, Mint etc). I tested the same code on the NanoPI board and it ran on the first try.

Why is this important?

The whole point of the Elements compiler toolchain is not just to provide alternative compilers; it’s also to ensure that the languages we support become first class citizens, side by side with other archetypical languages. For example, if all you know is C# or Java, writing kernel drivers has been off limits. If you are operating with traditional Java or .Net, you have to use a native bridge (like the service host under Windows). Your only other option was to code that particular piece in traditional C.

With Elements you can pick whatever language you know and target everything

With Elements that is no longer the case, because our compilers generate LLVM-optimized machine code; code that in terms of speed, access and power stands side by side with C/C++. You can even import C/C++ header files and work directly with the existing infrastructure. There is no middleware, no service host, no bytecodes and no compromise.

Obviously you can compile to bytecodes too if you like (or WebAssembly), but there are caveats to watch out for when using bytecodes on SBCs. The garbage collector can make or break your product, because when it kicks in it causes CPU spikes. This is where Elements steps up and delivers true native compilation. For all supported targets.

More boards to come

This is just my personal blog, so for the full overview of boards I am testing there will be a proper article on our official RemObjects blog-space. Naturally I can’t test every single board on the market, but I have around 10 different models, which cover the common boards used by IOT and embedded projects.

But for now at least, you can check off the ODroid N2 (64-bit) and NanoPI-Fire 2 (32-bit).

Check out RemObjects Remoting SDK

July 22, 2019

RemObjects Remoting SDK is one of those component packages that have become more than the sum of their parts. Just like Project JEDI has almost become standard equipment, Remoting SDK is a system that all Delphi and Freepascal developers should have in their toolbox.

In this article I’m going to present the SDK in broad strokes, from the viewpoint of someone who hasn’t used the SDK before. There is still a large number of Delphi developers who don’t know it even exists – hopefully this post will shed some light on why the system is worth every penny and what it can do for you.

I should also add that this is a personal blog. This is not an official RemObjects presentation, but a piece written by me based on my subjective experience and notions. We have a lot of running dialog at Delphi Developer on Facebook, so if I come across as overly harsh on a subject, that is my personal view as a Delphi developer.

Stop re-inventing the wheel

Delphi has always been a great tool for writing system services. It has accumulated a vast ecosystem of non-visual components over the years, both commercial and non-commercial, and this allows developers to quickly aggregate and expose complex behavior — everything from graphics processing to databases, file processing to networking.

The challenge for Delphi is that writing large composite systems, where you have more than a single service doing work in concert, is not factored into the RTL or project type. Delphi provides a bare-bone project type for system services, and that’s it. Depending on how you look at it, it’s either a blessing or a curse. You essentially start on C level.

So fundamental things like IPC (inter process communication) are something you have to deal with yourself. If you want multi-tenancy, that is likewise not supported out of the box. And all of this is before we venture into protocol standards, message formats and async vs synchronous execution.

The idea behind Remoting SDK is to get away from this style of low-level hacking. Without sounding negative, it provides the missing pieces that Delphi lacks, including the stuff that C# developers enjoy under .net (and then some). So if you are a Delphi developer who looks over at C# with a smidgen of envy, then you are going to love Remoting SDK.

Say goodbye to boilerplate mistakes

Writing distributed servers and services is boring work. For each function you expose, you have to define the parameters and data-types in a portable way, then you have to implement the code that represents the exposed function, and finally the interface itself that can be consumed by clients. The latter must be defined in a way that works with other languages too, not just Delphi. So while server tech in its essential form is quite simple, it’s the infrastructure that sets the stage for how quickly you can apply improvements and adapt to change.

For example, let’s say you have implemented a wonderful new service. It exposes 60 awesome functions that your customers can consume in their own work. The amount of boilerplate code for 60 distributed functions, especially if you operate with composite data types, is horrendous. It is a nightmare to manage and opens the door to sloppy, unnecessary mistakes.

After you install Remoting SDK, the service designer becomes a part of the IDE

This is where Remoting SDK truly shines. When you install the software, it integrates its editors and wizards closely with the Delphi IDE. It adds a ton of new project types, components and whatnot – but the most important feature is without a doubt the service designer.

Start the service-designer in any server or service project and you can edit the methods, data types and interfaces your system exposes to the world

As the name implies, the service designer allows you to visually define your services. Adding a new function is a simple click, and the same goes for datatypes and structures (record types). These datatypes are exposed too and can be consumed from any modern language. So a service you make in Delphi can be used from C#, C/C++, Java, Oxygene, Swift (and vice versa).

Auto generated code

A service designer is all good and well I hear you say, but what about that boilerplate code? Well Remoting SDK takes care of that too (kinda the point). Whenever you edit your services, the designer will auto-generate a new interface unit for you. This contains the classes and definitions that describe your service. It will also generate an implementation unit, with empty functions; you just need to fill in the blanks.

The designer is also smart enough not to remove code. So if you go in and change something, it won’t just delete the older implementation procedure. Only the params and names will be changed if you have already written some code.

Having changed a service, hitting F9 re-generates the interface code automatically. Your only job is to fill in the code for each method in the implementation units. The SDK takes care of everything else for you
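To give you an idea, the generated implementation skeleton ends up looking roughly like this. This is a simplified sketch from memory: the real generated units also contain invoker and class-factory registration code, and the exact base class names depend on the SDK version you use:

type
  TLoginService = class(TRORemoteDataModule, ILoginService)
  protected
    { ILoginService – generated stubs, you fill in the bodies }
    function Login(const aUsername, aPassword: string): boolean;
    procedure Logout(const aSessionId: string);
  end;

function TLoginService.Login(const aUsername, aPassword: string): boolean;
begin
  // your implementation goes here
  result := false;
end;

procedure TLoginService.Logout(const aSessionId: string);
begin
  // your implementation goes here
end;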

The service information, including the type information, is stored in a special file format called “RODL”. This format is very close to Microsoft’s WSDL format, but it holds more information. It’s important to underline that you can import the service directly from your servers (optional naturally) as WSDL. So if you want to consume a Remoting SDK service using Delphi’s ordinary RIO components, that is not a problem. Visual Studio likewise imports and consumes services – so Remoting SDK behaves identically regardless of platform or language used.

Remoting SDK is not just for Delphi, just to be clear on that. If you are presently using both Delphi and C# (which is a common situation), you can buy a license for both C# and Delphi and use whatever language you feel is best for a particular task or service. You can even get Remoting SDK for Javascript and call your service-stack directly from your website if you like. So there are a lot of options for leveraging the technology.

Transport is not content

OK so Remoting SDK makes it easy to define distributed services and servers. But what about communication? Are we boxed into RemObjects way of doing things?

The remoting framework comes with a ton of components, divided into 3 primary groups:

  • Servers
  • Channels (clients)
  • Messages

The reason for this distinction is simple: the ability to transport data is never the same as the ability to describe data. For example, a message is always connected to a standard. Its job is ultimately to serialize (represent) and de-serialize data according to a format. The server’s job is to receive a request and send a response. So these concepts are neatly decoupled for maximum agility.

As of writing the SDK offers the following message formats:

  • Binary
  • Post
  • SOAP
  • JSON

If you are exposing a service that will be consumed from JavaScript, throwing in a TROJSONMessage component is the way to go. If you expect messages to be posted from your website using ordinary web forms, then TROPostMessage is a perfect match. If you want XML then TROSOAPMessage rocks, and if you want fast, binary messages – well then there is TROBinaryMessage.

What you must understand is that you don’t have to pick just one! You can drop all 4 of these message formats and hook them up to your server or channel. The SDK is smart enough to recognize the format and use the correct component for serialization. So creating a distributed service that can be consumed from all major platforms is a matter of dropping components and setting a property.
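On the consuming side things are just as compact. A Delphi client typically pairs a channel with a message component and asks the generated proxy for the service interface. Below is a rough sketch of that pattern; the channel type, property names and the CoLoginService proxy are illustrative and may differ from what your version of the SDK actually generates:

var
  Channel: TROWinInetHTTPChannel;  // any channel type will do
  Msg: TROJSONMessage;             // or TROBinaryMessage, TROSOAPMessage...
  Service: ILoginService;
begin
  // Components are usually dropped on a form or datamodule;
  // they are created in code here only for brevity.
  Channel := TROWinInetHTTPChannel.Create(nil);
  Channel.TargetUrl := 'http://localhost:8099/json';
  Msg := TROJSONMessage.Create(nil);

  // CoLoginService is the proxy generated from the RODL file
  Service := CoLoginService.Create(Msg, Channel);
  if Service.Login('bob', 'secret') then
    ShowMessage('Logged in');
end;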

If you double-click on a server or channel, you can link message components with a simple click. No messy code snippets in sight.

Multi-tenancy out of the box

With the release of Rad-Server as a part of Delphi, people have started to ask what exactly multi-tenancy is and why it matters. I have to be honest and say that yes, it does matter if you are creating a service stack where you want to isolate the logic for each customer in compartments – but the idea that this is somehow new or unique is not the case. Remoting SDK has given users multi-tenancy support for 15+ years, which is also why I haven’t been too enthusiastic about Rad-Server.

Now don’t get me wrong, I don’t have an axe to grind with Rad-Server. The only reason I mention it is because people have asked how I feel about it. The tech itself is absolutely welcome, but it’s the licensing and throwing Interbase in there that rubs me the wrong way. If it could run on SQLite3 and were free with Enterprise, I would have felt differently about it.

There are various models for multi-tenancy, but they revolve around the same principles

To get back on topic: multi-tenancy means that you can dynamically load services and expose them on demand. You can look at it as a form of plugin functionality. The idea in Rad-Server is that you can isolate a customer’s service in a separate package – and then load the package into your server whenever you need it.

Some of the components that ship with the system

The reason I dislike Rad-Server in this respect is that it forces you to compile with packages. So if you want to write a Rad-Server system, you have to compile your entire project as package-based, and ship a ton of .bpl files with your system. Packages are not wrong or bad per se, but they open your system up on a fundamental level. There is nothing stopping a customer from rolling his own spoof package and potentially bypassing your security.

There is also an issue with un-loading a package, where right now the package remains in memory. This means that hot-swapping packages without killing the server won’t work.

Rad-Server is also hardcoded to use Interbase, which suddenly brings in licensing issues that rub people the wrong way. Considering the price of Delphi in 2019, Rad-Server stands out as a bit of an oddity. And hardcoding a database into it, with the licensing issues that brings, just rendered the whole system moot for me. Why should I pay more to get less? Especially when I have been using multi-tenancy with RemObjects for some 15 years?

With Remoting SDK you have something called DLL servers, which does the exact same thing – but using ordinary DLL files (not packages!). You don’t have to compile your system with packages, and it takes just one line of code to make your main dispatcher aware of the loaded service.

This actually works so well that I use Remoting SDK as my primary “plugin” system. Even when I write ordinary desktop applications that have nothing to do with servers or services – I always try to compartmentalize features that could be replaced in the future.

For example, I’m a huge fan of ElevateDB, which is a native Delphi database engine that compiles directly into your executable. By isolating that inside a DLL as a service, my application is now engine agnostic – and I get a break from buying a truck load of components every time Delphi is updated.

Saving money

The thing about DLL services is that you can save a lot of money. I’m actually using an ElevateDB license that was for Delphi 2007. I compiled the engine using D2007 into a DLL service – and then I consume that DLL from my more modern Delphi editions. I have no problem supporting or paying for components, that is right and fair, but having to buy new licenses for every single component each time Delphi is updated? This is unheard of in other languages, and I would rather ditch the platform altogether than fork out $10k every time I update.

A DLL server can be used for many things if you are creative about it

While we are on the subject – Hydra is another great money saver. It allows you to use .net and Java libraries (both visual and non-visual) with Delphi. With Hydra you can design something in .net, compile it into a DLL file, and then use that from Delphi.

But — you can also compile things from Delphi, and use them in newer versions of Delphi. I’m not forking out for a Developer Express update just to use what I have already paid for in the latest Delphi. I have one license, I compile the forms and components into a Hydra module — and then use it from newer Delphi editions.

Hydra, which is a separate product, allows you to stuff visual components and forms inside a vanilla DLL. It allows cross-language use, so you can finally use Java and .net components inside your Delphi application

Bonjour support

Another feature I love is the zero configuration support. This is one of those things that you often forget, but that suddenly becomes important once you deploy a service stack on cluster level.

Remoting SDK comes with support for Apple Bonjour, so if you want to use that functionality you have to install the Bonjour library from Apple. Once installed on your host machines, your RemObjects services can find each other.

ZeroConfig is not that hard to code manually. You can roll your own using UDP or vanilla messages. But getting service discovery right can be fiddly. One thing is broadcasting a UDP message saying “here I am”; it’s something else entirely to allow service discovery on cluster level.
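For comparison, the “roll your own” starting point really is just a broadcast. Here is a minimal sketch using Indy’s TIdUDPClient (assuming Indy, which ships with Delphi, is available); everything beyond this – answering, registering, time-outs, cluster awareness – is where the real work lies:

uses
  IdUDPClient;

procedure AnnouncePresence(const ServiceName: string; Port: word);
var
  Client: TIdUDPClient;
begin
  Client := TIdUDPClient.Create(nil);
  try
    Client.BroadcastEnabled := true;
    // Shout "here I am" to everyone on the local subnet
    Client.Broadcast(ServiceName + ' online', Port);
  finally
    Client.Free;
  end;
end;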

If Bonjour is not your cup of tea, the SDK provides a second option, which is RemObjects own zero-config hub. You can dig into the documentation to find out more about this.

What about that IPC stuff you mentioned?

I mentioned IPC (inter process communication) at the beginning here, which is a must-have if you are making a service stack where each member is expected to talk to the others. In a large server-system the services might not exist on the same physical hardware either, so you want to account for that.

With the SDK this is just another service. It takes 10 minutes to create a DLL server with the functionality to send and receive messages – and then you just load and plug that into all your services. Done. Finished.

Interestingly, Remoting SDK supports named-pipes. So if you are running on a Windows network it’s even easier. Personally I prefer to use a vanilla TCP/IP based server and channel, that way I can make use of my Linux blades too.

Building on the system

There is nothing stopping you from expanding the system that RemObjects has established. You are not forced to only use their server types, message types and class framework. You can mix and match as you see fit – and also inherit out your own variation if you need something special.

For example, WebSocket is an emerging standard that has become wildly popular. Remoting SDK does not support it out of the box, partly because the standard is practically identical to the RemObjects super-server, and partly because there must be room for third party vendors.

Andre Mussche took the time to implement a WebSocket server for Remoting SDK a few years back, demonstrating in the process just how easy it is to build on the existing infrastructure. If you are already using Remoting SDK or want WebSocket support, head over to his GitHub repository and grab the code there: https://github.com/andremussche/DelphiWebsockets

I could probably write a whole book covering this framework. For the past 15 years, RemObjects Remoting SDK is the first product I install after Delphi. It has become standard for me and remains an integral part of my toolkit. Other packages have come and gone, but this one remains.

Hopefully this post has tickled your interest in the product. No matter if you are maintaining a legacy service stack, or thinking about re-implementing your existing system in something future-proof, this framework will make your life much, much easier. And it won’t break the bank either.

You can visit the product page here: https://www.remotingsdk.com/ro/default.aspx

And you can check out the documentation here: https://docs.remotingsdk.com/

RemObjects Remoting SDK?

June 3, 2019

Reading this you could be forgiven for thinking that I must promote RemObjects products – it’s my job now, right? Well yes, but also no.

The thing is, I’m really not “traveling salesman” material by any stretch of the imagination. My tolerance for bullshit is ridiculously low, and being practical by nature I loathe fancy products that cost a fortune yet deliver nothing but superficial fluff.

The reasons I went to work at RemObjects are many, but most of all it’s because I have been an avid supporter of their products since they launched. I have used and seen their products in action under intense pressure, and I have come to put some faith in their solutions.

Trying to describe what it’s like to write servers that should handle thousands of active users “with or without” RemObjects Remoting SDK is exhausting, because you end up sounding like a fanatic. Having said that, I feel comfortable talking about the products because I speak from experience.

I will try to outline some of the benefits here, but you really should check it out yourself. You can download a trial directly here: https://www.remotingsdk.com/ro/

Remoting framework, what’s that?

RemObjects Remoting framework (or “RemObjects SDK” as it was called earlier) is a framework for writing large-scale RPC (remote procedure call) servers and services. Unlike the typical solutions available for Delphi and C++ Builder, including those from Embarcadero I might add, the RemObjects framework stands out because it distinguishes between transport, host and message-format – and above all, because of its sheer quality and ease of use.

RemObjects Remoting SDK ships with a rich selection of channels and message formats

This separation between transport, host and message-format makes a lot of sense, because the parameters and data involved in calling a server-method shouldn’t really be affected by how they got there.

And this is where the fun begins because the framework offers you a great deal of different server types (channels) and you can put together some interesting combinations by just dragging and dropping components.

How about JSON over email? Or XML over pipes?

The whole idea here is that you don’t have to just work with one standard (and pay through the nose for the privilege). You can mix and match from a rich palette of transport mediums and message-formats and instead focus on your job: delivering a kick-ass product.

And should you need something special that isn’t covered by the existing components, inheriting out your own channel or message classes is likewise a breeze. For example, Andre Mussche has some additional components on GitHub that add a WebSocket server and client. So there is a lot of room for expanding and building on the foundation provided by RemObjects.

And this is where RemObjects has the biggest edge (imho), namely that their solutions shave weeks if not months off your development time. And the central aspect of that is their integrated service designer.

Integration into the Delphi IDE

Dropping components on a form is all good and well, but the moment you start coding services that deploy complex data-types (records or structures) the amount of boilerplate code can become overwhelming.

The whole point of a remoting framework is that it should expose your services to the world. Someone working in .net or Java on the other side of the planet should be able to connect, consume and invoke your services. And for that to happen every minute detail of your service has to follow standards.

The RemObjects Service Builder integrates directly into the Delphi IDE

When you install RemObjects SDK, it also integrates into the Delphi IDE. And one of the features it integrates is a complete, separate service designer. The designer can also be used outside of the Delphi IDE, but I cannot underline enough how handy it is to be able to design your services visually, right there and then, in the Delphi IDE.

This designer doesn’t just help you design your service description (RemObjects has their own RODL file-format, which is a bit like a Microsoft WSDL file); the core purpose is to auto-generate all the boilerplate code for you — directly into your Delphi project (!)

So instead of you having to spend a week typing boilerplate code for your killer solution, you get to focus on implementing the actual methods (which is what you are supposed to be doing in the first place).

DLL services, code re-use and multi-tenancy

The idea of multi-tenancy is an interesting one. One that I talked about with regards to Rad-Server both in Oslo and London before Christmas. But Rad-Server is not the only system that allows for multi-tenancy. I was doing multi-tenancy with RemObjects SDK some 14 years ago (if not earlier).

Remember how I said the framework distinguishes between transport, message and host? That last bit, namely host, is going to change how you write applications.

When you install the framework, it registers a series of custom project types inside the Delphi IDE. So if you want to create a brand new RemObjects SDK server project, you can just do that via the ordinary File->New->Other menu option.

One of the project types is called a DLL Server. Which literally means you get to isolate a whole service library inside a single DLL file! You can then load in this DLL file and call the functions from other projects. And that is, ultimately, the fundamental principle for multi-tenancy.

And no, you don’t have to compile your project with external packages for this to work. The term “dll-server” can also be a bit confusing, because we are not compiling a network server into a DLL file; we are placing the code for a service into a DLL file. I used this project type to isolate common code, so I wouldn’t have to copy unit-files all over the place when delivering the same functionality.

It’s also a great way to save money. Don’t want to pay for that new upgrade? Happy with the database components you have? Isolate them in a DLL-Server and continue to use the code from your new Delphi edition. I have Delphi XE3 database components running inside a RemObjects DLL-Server that I use from Delphi 10.3.

DLL server is awesome and elegantly solves real-life problems out of the box

In my example I was doing business-logic for our biggest customers. Each of them used the same database, but the way they registered data was different. The company I worked for had bought up these projects (and thus their customers with them), and in order to keep the customers happy we couldn’t force them to re-code their systems to match ours. So we had to come up with a way to upgrade our technology without forcing a change on them.

The first thing I did was to create a “DLL server” that dealt with the database. It exposed methods like openTable(), createInvoice(), getInvoiceById() and so on. All the functions I would need to work with the data without getting my fingers dirty with SQL outside the DLL. So all the nitty gritty of SQL components, queries and whatnot was neatly isolated in that DLL file.
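In simplified form, the interface for that database DLL looked something like this. The method names follow the ones mentioned above; the parameter and result types here are illustrative only, and the real generated code carries the RemObjects-specific base types and registration boilerplate:

type
  IDatabaseService = interface
    // Opens a table and returns it in a serialized, transport-friendly form
    function OpenTable(const TableName: string): string;
    // Creates a new invoice and returns its id
    function CreateInvoice(const CustomerId: integer): integer;
    // Fetches a single invoice by id
    function GetInvoiceById(const InvoiceId: integer): string;
  end;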

I then created separate DLL-Server projects for each customer, and implemented their service interfaces identically to their older APIs. These DLLs directly referenced the database library for authentication and for doing the actual work.

When integrated with the IDE, you are greeted with a nice welcome window when you start Delphi. Here you can open examples or check out the documentation

Finally, I wrapped it all up in a traditional Windows system service, which contained two different server-channels and the message formats they needed. When the service was started it would simply load in the DLLs and manually register their services and types with the central channel — and voila, it worked like a charm!

Rock solid

Some 10 years after I delivered the RemObjects based solution outlined above, I got a call from my old employer. They had been victim of a devastating cyber attack. I got a bit anxious as he went on and on about damages and costs, fearing that I had somehow contributed to the situation.


But it turned out he called to congratulate me! Out of all the services in their server-park, mine were the only ones left standing when the dust settled.

The RemObjects payload balancer had correctly dealt with both DDOS and brute force attacks, and the hackers were left wanting at the gates.

LDef parser done

July 21, 2017 Leave a comment

Note: For a quick introduction to LDef click here: Introduction to LDef.

Great news guys! I finally finished the parser and model builder for LDef!

That means we just need to get the assembler ported. This is presently running fine under Smart Pascal (I like to prototype things there since it’s faster) – and it will be easy to port it over to Delphi and Freepascal after the model has gone through the steps.

I’m really excited about this project and while I sadly don’t have much free time – this is a project I truly enjoy working on. Perhaps not as much as Smart Pascal, which is my baby, but still; it’s turning into a fantastic system.

Thoughts on the architecture

One of the things I added support for – and something I have hoped Embarcadero would add to Delphi for a number of years now – is support for contract coding. This is a huge topic that I’m not jumping into here, but one of the features it requires is support for entry and exit sections. Essentially, you can define code that executes before the method body and directly after it has finished (before the result is returned if it’s a function).

This opens up for some very clever means of preventing errors, or at the very least giving the user better information about what went wrong. Automated tests also benefit greatly from this.

A normal object pascal method looks, for example, like this:

procedure TForm1.MySpecialMethod;
begin
  writeln('You called my-special-method');
end;

The basis of contract design builds on the classical form and expands it like this:

procedure TForm1.MySpecialMethod;
  Before()
  begin
    writeln('Before my-special-method');
  end;

  After()
  begin
    writeln('After my-special-method');
  end;

begin
  writeln('You called my-special-method');
end;

Note: contract design is a huge system and this is just a fragment of the full infrastructure.

What is cool about the before/after snippets, is that they allow you to verify parameters before the body is even executed, and likewise you get to work on the result before the value is returned (if any).
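
To illustrate, here is a hedged sketch using the same proposed syntax as the snippet above (this is not something a shipping Delphi compiler accepts today), with made-up method and field names:

procedure TForm1.SetPercentage(const Value: integer);
  Before()
  begin
    // reject bad input before the method body ever runs
    if (Value < 0) or (Value > 100) then
      raise Exception.Create('Percentage out of range');
  end;

  After()
  begin
    // verify the state after the body has finished
    if FPercentage <> Value then
      raise Exception.Create('Assignment failed');
  end;

begin
  FPercentage := Value;
end;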

You might ask, why not just write such tests directly in the method body like people do all the time? Well, you could. But there will also be methods that you have no control over, like a wrapper method that calls a system library for instance. Being able to attach before/after code to externally defined procedures helps take the edge off error testing.

Secondly, if you are writing a remoting framework where variant data and multi-threaded invocation are involved, being able to check things as they are dispatched means catching potential errors faster – leading to better performance.

As always, coding techniques are a source of argument – so I’m not going into that now. I have added support for it, and if people don’t need it then fine, just leave it be.

Under LDef assembly it looks like this:

public void main() {
  enter {
  }

  leave {
  }
}

Well I guess that’s all for now. Hopefully my next LDef post will be about the assembler being ready – leaving just the linker. I need to experiment a bit with the codegen and linker before the unit format is complete.

The bytecode-format needs to include enough information so that the linker can glue things together. So every class, member, field etc. must be emitted in a way that allows the linker to quickly look things up. It also needs to write the actual, resulting method offsets into the bytecode.

Have a happy weekend!

Smart-Pascal: A brave new world, 2022 is here

April 29, 2017 6 comments

Trying to explain what Smart Mobile Studio does and the impact it can have on your development cycle is very hard. The market is rampant with superficial frameworks that promise you the world, and investors have been taken for a ride by hyped up, one-click “app makers” more than once.

I can imagine that being an investor is a bit like panning for gold. Things that glitter the most often turn out to be worthless – yet fortunes may hide beneath unpolished and rugged surfaces.

Software will disrupt most traditional industries in the next 5-10 years.
Uber is just a software tool, they don’t own any cars, yet they are now the
biggest taxi company in the world. -Source: R.M.Goldman, Ph.d

So I had enough. Instead of trying to tell people what I can do, I decided I’m going to show them instead. As the Americans say: “talk is cheap”. And a working demonstration is worth a thousand words.

Care to back that up with something?

A couple of weeks ago I published a video on YouTube of our Smart Pascal based desktop booting up in VMWare. The Amiga forums went off the chart!


For those that haven’t followed my blog or know nothing about the desktop I’m talking about, here is a short summary of the events so far:


Smart Mobile Studio is a compiler that takes pascal, like that made popular in Delphi or Lazarus, and compiles it to JavaScript instead of machine-code.

This product has shipped with an example of a desktop for years (called “Quartex media desktop”). It was intended as an example of how you could write a front-end for kiosk machines and embedded devices. Systems that could use a touch screen as the interface between customer and software.

You have probably seen those info booths in museums, universities and libraries? Or the ticket machines in subways, train-stations or even your local car-wash? All of those are embedded systems. And up until recently these have been small and expensive computers for running Windows applications in full-screen. Applications which in turn talk to a server or local database.

Smart Mobile Studio is able to deliver the exact same (and more) for a fraction of the price. A company in Oslo replaced their $300 per-board unit with off-the-shelf $35 Raspberry Pi mini-computers. They then used Smart Pascal to write their client software and ran it in a fullscreen browser. The Linux distribution was changed to boot straight into Firefox in full-screen. No Linux desktop, just a web display.

The result? They were able to cut production cost by $265 per unit.


Right, back to the desktop. I mentioned the Amiga community. This is a community of coders and gamers that grew up with the old Commodore machines back in the 80s and 90s. A new Amiga is now on the way (just took 20+ years) – and the look and feel of the new operating-system, Amiga OS 4.1, is the look and feel I have used in The Smart Desktop environment. First of all because I grew up on these machines myself, and secondly because the architecture of that system was extremely cost-effective. We are talking about a system that delivered pre-emptive multitasking in as little as 512Kb of memory (!). So this is my “ode to OS 4” if you will.

And the desktop has caused quite a stir both in the Delphi community, cloud community and retro community alike. Why? Because it shows some of the potential cloud technology can give you. Potential that has been under their nose all this time.

And even more important: it demonstrates how productive you can be in Smart Pascal. The operating system itself, both visual and non-visual parts, was put together in my spare time over 3 weeks. Had I been able to work on it daily (as a normal job) I would have knocked it out in a week.

A desktop as a project type

All programming languages have project types. If you open up Delphi and click “new” you are greeted with a rich menu of different projects you can make. From low-level DLL files to desktop applications or database servers. Delphi has it all.


Delphi offers a wide range of project types you can create

The same is true for Visual Studio. You click “new solution” and can pick from a wide range of different projects: web projects, servers, desktop applications and services.

Smart Pascal is the only system where you click “new project” and there is a type called “Smart desktop” and “Smart desktop application”. In other words, the power to create a full desktop is now an integrated part of Smart Pascal.

And the desktop is unique to you. You get to shape it, brand it and make it your own!

Let us take a practical example

Imagine a developer given the task to move the company’s aging invoice and credit system from the Windows desktop – to a purely web-based environment.

The application itself is large and complex, littered with legacy code and “quick fixes” going back decades. Updating such a project is in itself a monumental task – but having to first implement concepts like what a window is, tasks, user space, cloud storage, security endpoints, look and feel, back-end services and database connectivity; all of that before you even begin porting the invoice system itself? The cost is astronomical.

And it happens every single day!

In Smart Pascal, the same developer would begin by clicking “new project” and selecting “Smart desktop”. This gives him a complete desktop environment that is unique to his project and company.

A desktop that he or she can shape, adjust, alter and adapt according to the needs of the employer. Things like file-type recognition, storage, getting that database – all of these things are taken care of already. The developer can focus on the task at hand, namely to deliver a modern implementation of their invoice and credit software – not waste months trying to force JavaScript frameworks to do things they simply lack the depth to deliver.

Once the desktop has the look and feel in order, he would have to make a simple choice:

  • Should the whole desktop represent the invoice system or ..
  • Should the invoice system be implemented as a secondary application running on the desktop?

If it’s a large and dedicated system where the users have no need for other programs running, then implementing the invoice system inside the desktop itself is the way to go.

If however the customer would like to expand the system later, perhaps add team management, third-party web-services or open-office like productivity (a unified intranet if you like) – then the second option makes more sense.

On the brink of a revolution

The developer of 2022 is not limited to the desktop. He is not restricted to a particular operating system or chip-set. Fact is, cloud has already reduced these to a matter of preference. There is no strategic advantage of using Windows over Linux when it comes to cloud software.

Where a traditional developer writes and implements a solution for a particular system (for instance Microsoft Windows, Apple OS X or Linux) – cloud developers deliver whole ecosystems; constellations of software constructed from many parts, both micro-services developed in-house and services from others, like Amazon or Azure.

All these parts co-operate and can be combined through established end-point standards, much like how components are used in Delphi or Visual Studio today.


The Smart Desktop, codename “Amibian.js”

Access to products written in Smart is through the browser, or sometimes through a “paper thin” native host (Cordova PhoneGap, Delphi and C/C++) that exposes system level functionality. These hosts wrap your application in a native, executable container ready for the App Store or Google Play.

Now the visual content is typically the same, and is only adapted for a particular device. The real work is divided between the client (which is now very much capable) and your server back-end.

So people still write code in 2022, but the software behaves differently and is designed to function as a group (cluster). And this requires a shift in the way we think.


Above: One of my asm.js prototype compilers. Let’s just say it runs fast!

Scaling a solution from processing 100 invoices a minute to handling 100.000 invoices a minute is no longer a matter of code, but of architecture. This is where the traditional, native-only approach to software comes up short, while more flexible approaches like node.js are infinitely more capable.

What has emerged up until now is just the tip of the iceberg.

Over the next five to eight years, everything is going to change. And the changes will be irrevocable and permanent.

Running your Smart Pascal server as a system-level daemon is very easy once you know what to look for :)

The Smart Desktop back-end running as a system service on a Raspberry PI

As the Americans say, talk is cheap – and I’m done talking. I will do this with you, or without you. Either way it’s happening.

Nightly-build of the desktop can be tested here: http://quartexhq.myasustor.com/

Delphi developer on its own server

April 4, 2017 Leave a comment

While the Facebook group will naturally continue exactly like it has these past years, we have set up a server for active Delphi developers on my Quartex server.

This has a huge benefit: first of all, those that want to test the Smart Desktop can do so from the same domain – and people who want to test Smart Mobile Studio can do so with myself just a PM away. Error reports etc. will still need to be sent to the standard e-mail, but now I can take a more active role in supervising the process and help clear up whatever misunderstanding could occur.


Always good with a hardcore Smart, Laz, amibian.js forum!

Besides that, we are building a lively community of Delphi, Lazarus, Smart and Oxygene/RemObjects developers! Need a job? Have work you need done? Post an ad – it won’t cost you a penny.

So why not sign up? It’s free, it won’t bite you, and we can continue regardless of Facebook’s up-time this year..

You enter here and just fill out user/pass and that’s it: http://quartexhq.myasustor.com/sharetronix/

Smart Pascal: Arduino and 39 other boards

January 17, 2017 Leave a comment

Did you know that you can use Smart Mobile Studio to write your Arduino controller code? People think they need to dive into C, native Delphi or (shrug) Java to work with controller boards or embedded SoCs, but fact is – Smart Pascal is easier to set up, cheaper and a hell of a lot more productive than C.

Arduino getting you down? Why bother, whip out Smart Pascal and do some mild wrapping and kill 40 birds with one stone

What may come as a surprise to you though, is that Smart allows you to write code for no less than 40 micro-controllers and embedded devices! Yes, you read that right: 40 different boards from the same codebase.

Johnny number five

Node.js is the new favorite automation platform in the world of hardware, and boards both large and small either support JavaScript directly, or at the very least try to be JS friendly. And since Smart Mobile Studio supports Node.js very well (and it’s about to get even better support), that should give you some ideas about the advantages here.

The magic library is called Johnny number five (JN5) and it gives you a more or less unified API for working with GPIO and a ton of sensors. And it’s completely free!

As of writing I haven’t gotten around to writing the wrapper code for JN5, but the library is so clean and well-defined that it should not take more than a day to port over. You can also take a shortcut via our Typescript importer and use our import tool.

Check out the API here: http://johnny-five.io/api/

Here is a list of the device classes JN5 supports:

  • Altimeter
  • Animation
  • Barometer
  • Button
  • Compass
  • ESC
  • ESCs
  • Expander
  • Fn
  • GPS
  • Gyro
  • Hygrometer
  • IMU
  • IR.Reflect.Array
  • Joystick
  • Keypad
  • LCD
  • Led
  • Led.Digits
  • Led.Matrix
  • Led.RGB
  • Leds
  • Light
  • Motion
  • Motor
  • Motors
  • Multi
  • Piezo
  • Pin
  • Proximity
  • Relay
  • Relays
  • Sensor
  • Servo
  • Servos
  • ShiftRegister
  • Stepper
  • Switch
  • Thermometer

And enjoy one of the many tutorials here: http://johnny-five.io/articles/

LDef Intermediate Language

January 13, 2017 Leave a comment

The LDEF bytecode engine is some time away from completion, but the IL source format that the assembler reads and turns into bytecode is getting there. At the moment there are only a few tidbits left to explore, like interfaces and generics, but those will be added.

It’s a real brain teaser because some of the stuff that makes up a compiler is not really related to code. When you think about generics you often make the mistake of thinking this is a code feature, like inheritance or virtual methods; something that the code-emitter has to deal with or that the runtime engine has to account for. But generics are actually implemented higher up. They exist between the parser and the code-emitter.

Interfaces are another mirage or technological illusion. When you work with classes and interfaces you get a sense that it’s a solid thing, you know – you write a class and create objects, and people imagine these objects as independent, material objects in a virtual space. I tend to think of instances as molecules.

But objects are ultimately an illusion. I’m not going to cover the intricate details of object orientation here, but OOP is actually about separating the data (fields) from the code acting on that data (members). So the only objects that really exist in the memory of a computer when you create instances, are buffers representing the fields of your class – combined with references and a table mapping the entry points for the members that make up that instance (the VMT). Compilers actually mimic object orientation by adding a secret parameter to all your methods, namely the “self” parameter. This “self” is a pointer to a record that contains all the information pertaining to that instance.
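
A rough way to picture it in object pascal (the class is made up, and a real compiler does this under the hood rather than in source):

type
  TCounter = class
  private
    FValue: integer;   // the only per-instance data that actually exists
  public
    procedure Add(Amount: integer);
  end;

procedure TCounter.Add(Amount: integer);
begin
  // What you write:
  //   FValue := FValue + Amount;
  // What the compiler effectively generates is a plain routine with a
  // hidden first parameter:
  //   procedure TCounter_Add(Self: TCounter; Amount: integer);
  //   begin
  //     Self.FValue := Self.FValue + Amount;
  //   end;
  FValue := FValue + Amount;
end;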


Which is really cool, because then you can create as many instances as you like – and they all use the same code. Which of course is what object orientation is all about. But there is no such thing as an independent instance floating around computer memory. That is an optical illusion of sorts.

LDEF virtual machine

Virtual machines get to have all the fun. I mean, writing a compiler that emits real machine code is in itself not that hard, but generating a scheme that makes object orientation work and that keeps track of everything is. The x86 cpu architecture may be powerful but it’s an absolute bitch to work with. It has few registers (compared to ARM and PPC), several instruction sets and is known to be a somewhat unfriendly place. This is the reason compiler makers tend to stick to a golden mean of instructions. Compilers like C++ Builder and Delphi could actually generate faster and more efficient code if they knew exactly what cpu and architecture you used. But since that’s not how PCs work, and there are some 20 different cpu models on the market at any given time – with huge differences between AMD and Intel – it makes sense to use a safe collection of instructions.

LDEF is a bytecode runtime engine. And one of the perks of bytecode is that it’s very much abstracted from the whole issue of real-life assembly. Most programmers think that if you make bytecodes then you don’t have to think about low level stuff, which is cheating. That may be the case for other languages and engines, but not LDEF. In fact, the whole point of writing a new engine is that I wanted a bytecode format that was closer to machine code. This is very important because at some point myself or someone else will write a JIT compiler for this, and if the runtime is too high-level or abstract, that is going to be very hard.

LDEF testbed

The LDEF instruction-set represents a golden mean of the average processor instruction set. I have taken instructions that most processors have, typical stuff like add, subtract, divide, modulus, multiply (and so on). Testing is where I have given LDEF some advantages, for instance when testing a list of parameters for conditions. Instead of keeping that as two instructions (test, branch [condition]) I have isolated that in a single instruction.

Let’s look at the instruction set so far:

  • move
  • blit – move memory block
  • valloc – allocate variable
  • vfree – free variable
  • malloc – allocate memory
  • mfree – release memory
  • add
  • sub
  • mul
  • muldiv
  • div
  • mod
  • moddiv
  • lsr – logical bit-shift right
  • lsl – logical bit-shift left
  • cmp – compare
  • tst – test
  • bne – branch not equal
  • beq – branch equal
  • ble – branch less
  • bgt – branch greater
  • jsr – jump sub routine
  • jmp – jump absolute
  • push – push to stack
  • pop – pop from stack
  • IFnb – test and branch if not equal
  • IFtb – test and branch if equal
  • throw – cause exception
  • syscall – invoke engine specific Delphi code

One of the benefits of a virtual machine is that you can do some magic with variables. In a real machine-code compiler, variable allocation and doing read / write operations can be very complex. But in a virtual machine you thankfully get to do something about that.

So LDEF allows you to move data seamlessly between variables and registers, which really is a massive time saver. The code standard also supports two levels of resources, global and local. The latter meaning class in this case. So there is ample room for high-level languages to store their data and implement classical features (like “resourcestring” in Delphi).

You also have support for constants. These are stored separately in the bytecode and loaded into a lookup-table associated with a class. Constants are different from resources in that they are designed to be quickly referenced. Constants have more infrastructure in the form of lookup-tables – because they are meant to be used with the instructions. Resources are more like resource-strings in Delphi; they are stored in their own place in the bytecode and are loaded on demand. Constants are not. They are loaded into memory and are instantly available.

Having said that, the bytecode compiler also supports in-place data. Meaning that you can choose to write constant data where it is used. So instead of having to load a string constant (for example) before working with it, you can compile the string directly into the move instruction (or any other instruction). So this is actually perfectly valid:

move r0, "Hello world";

Other virtual machines, like the Java engine, force you to do this in two steps, which is slower:

lda r0, const_id; // reference const by id
ldc r1, (r0); // get actual value into r1

You can also assign variables directly, you don’t need to load the effective address first. So there is no extra cost involved in moving data between a register and a variable, a variable and a variable, or a resource / const and a variable. This makes life much easier:

move V[$00F6], "String assignment";
move V[$01F0], 19875.32;
move V[$196], V[$197];

Registers

Another thing that is beneficial is that LDEF has 32 registers to work with (you can actually change that, so if you need 64 that’s no problem). How you use these is up to you, but it gives a lot of room for OOP schemes. For instance, if a language has a maximum limit of 6 parameters per method – you actually don’t need to pass values on the stack at all (like Java does); you can map parameters directly to registers.

Delphi, C++ Builder and other native solutions tend to use stack-schemes to store information. So a pointer to some list is stored as the fourth item on the stack, and some other entry as the third (and so on). Which is perfectly fine and very effective on a real CPU (you really have no choice on a real x86). In LDEF you can now use specific registers instead, which is a lot easier. What scheme you choose to use is naturally up to you – but at least the option is there.

Here are the registers LDEF presently supports:

  • R0 – R31: general purpose registers
  • SP: stack pointer
  • PC: program control register
  • BC: branch control register
  • ES: exception state register

Optimization

If you are wondering why a virtual machine would support so much similar stuff – constants and resources, in-place data or not – this all has to do with optimization.

For example, if your compiler decides to collect all assignment values as constants, the code size of your program might be smaller; it all depends on what your code looks like. You will naturally consolidate identical strings, integer values and so on. So even if you have a program that writes “Hello world!” 10.000 times to the console, only one “Hello world!” will be stored in the bytecode file. So constants give you smaller code, but at a cost of speed – since constants need a lookup whenever they are used.

Now constants here are not “const” declarations. Constants on this level are actually the values you write in plain-text in your program, stuff like this:

FMyText := 'Welcome to my program';
FCount := 100;

Both “welcome to my program” and “100” are constants. These are values that will never change, and compilers usually have to deal with them in one of the two ways I explained above (in place or not).

Source format

The LDEF intermediate format looks pretty much like C++. C syntax is fast and easier to parse than other languages, so it made sense to pick that. But this similarity is paper thin, in that only the constructs of C++ are used. Methods contain only LDEF assembly language code, and variables are isolated in alloc sections. The virtual machine takes care of the OOP layer for you, so you don’t have to worry about that.

Here is an example to give you a feel for it:

#include <stdio>;
#include <oop>;

struct CustomType
{
  uint32 ctMagic;
  uint32 ctSize;
  uint8  ctData[4096];
}

resources
{
  dc.s #welcome, "welcome to LDef";
}

class TBaseObject: object
{
  /* class-fields */
  alloc {
    uint8 temp;
    uint32 counter;
    CustomType handle;
  }

  /* Parser now handles register mapping */
  public void main(r0 as count, r1 as text) {
    enter { }
    leave { }

    alloc {
      /* method variables */
      uint32 _longs;
      uint32 _singles;
    }
    /* calc longs */
    move r1, count;
    mod r1, 8;
    move r2, count;
    move _longs, r1;

    /* calc singles */
    move r3, r1;
    mul  r3, 8;
    sub  r2, r3;
    move _singles, r2
  }

  /* test multi push to stack */
  private void _cleanup() {
     push [r0, r1, r2];
  }
}

Keep in mind that the format is not 100% finished yet; there are still a few tidbits that need to be worked out. But the general system is firmly in place.

One of the cool things is that I get to add a few missing features from my favorite language, Delphi (object pascal). If you look at the first method (main), you will notice that it has two extra sections: enter and leave.

These sections can contain code that execute directly before and after the body of the method. In pascal it would look something like this:

procedure TMyClass.SomeProc;
  before
    writeln('about to execute someproc');
  end;

  after
    writeln('Someproc finished');
  end;
begin
end;

The purpose of these, especially the before() section, is to make sure the parameters are valid. So before is used to check that the parameters are within a legal range. After is naturally used to ensure that the result (for functions) is valid.

It also opens up for some interesting logging and debugging aspects.

More to come

This is turning into a long rant, but hopefully you found it interesting. I’ll keep you posted as the engine progresses. There is still some way to go, but we are getting there. Once LDEF is implemented the fun starts; that’s when I’ll code high-level parsers that target the engine. First stop is naturally an object pascal compiler :)

Delphi bytecode compiler

January 10, 2017 1 comment

This is a pet project I have been playing with on/off for a couple of years now. Since I’m very busy with Smart Mobile Studio these days I haven’t really had time to write much about this in a while. Well, today I needed a break from JavaScript – and what is more fun than relaxing with a cup of cocoa and bytecodes?

The LDEF bytecode standard

Do you remember Quartex Pascal? It was an alternative pascal compiler I wrote a couple of years back. The plan was initially to get Smart Mobile Studio into that codebase, and essentially kill 3 birds with one stone. I already have parsers for various languages that I have written: Visual Basic .NET, Typescript, C# and C++. Adding support for Smart Pascal to that library of parsers would naturally give us a huge advantage.

In essence you would just pick what language you wanted to write your Smart apps in, then use the same classes and RTL across the board. You could also mix and match, perhaps write some in typescript (which is handy when you face a class that is difficult to “pascalify” via the import tool).

LDEF test environment

But parsing code and building an AST (abstract syntax tree – a hierarchy of objects the parser builds from your source-code; basically your program in object form) is only half the job. People often start coding languages and script engines without really thinking it through (I have done that myself a few times. It’s a great way to learn the ropes, but you waste a lot of time), and after a while they give up, because it dawns on them that the hard part is still ahead: namely to generate executable code or at least run the code straight off the AST. There is a reason components like dwScript have taken years to polish and perfect (the same can be said of PAX script, which is probably one of the most powerful engines available to Delphi developers).
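
If the AST part sounds abstract, a bare-bones node type in object pascal could look something like this (purely illustrative, not the actual Quartex Pascal classes):

uses
  Classes;

type
  TNodeKind = (nkProgram, nkClass, nkMethod, nkStatement, nkExpression);

  // Minimal AST node: a kind, a textual value and a list of children.
  // Real parsers carry far more information (source positions, types, etc.)
  TSyntaxNode = class
  public
    Kind: TNodeKind;
    Value: string;
    Children: TList;   // nested TSyntaxNode instances
    constructor Create(AKind: TNodeKind; const AValue: string);
    destructor Destroy; override;
  end;

constructor TSyntaxNode.Create(AKind: TNodeKind; const AValue: string);
begin
  inherited Create;
  Kind := AKind;
  Value := AValue;
  Children := TList.Create;
end;

destructor TSyntaxNode.Destroy;
var
  x: integer;
begin
  for x := 0 to Children.Count - 1 do
    TSyntaxNode(Children[x]).Free;
  Children.Free;
  inherited;
end;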

The idea behind Quartex Pascal was naturally that you could have more than one language, but they would all build the same AST and all of them emit the same bytecode. Not exactly a novel idea, this is how most scripting engines work – and also what Java and .Net have been doing since day one.

But the hard part was still waiting for me, namely to generate bytecode. This may sound easy but it’s not just a matter of dumping out a longword with the index of a method. You really need to think it through, because if you make it too high-level, it will cripple your language. If you make it too low-level, the code will execute slowly and the amount of bytecode generated can become huge compared to normal assembly-code.

And no, I’m not making it compatible with CIL or Java, this is a pure object pascal solution for object pascal developers (and C++ builder naturally).

LDEF source language

I decided to do something a bit novel. Rather than just creating some classes to help you generate bytecode, I decided to create a language around the assembly language. Since C++ is easy to parse, and looks the way it does because of its close connection to assembler, I decided to use C++ as the basic dialect.

So instead of you emitting bytecodes, your program only needs to emit LDEF source-code. Then you can just call the assembler program (command line) and it will parse, compile and turn it into assemblies for you.

Here is an example snippet that compiles without problem:

struct CustomType
{
  uint32 ctMagic;
  uint32 ctSize;
  uint8  ctData[4096];
}

class TBaseObject: object
{
  /* class-fields */
  alloc {
    uint8 temp;
    uint32 counter;
  }

  /* Parser now handles register mapping */
  public void main(r0 as count, r1 as text) {
    alloc {
      /* method variables */
      uint32 _longs;
      uint32 _singles;
    }
    move r1, count;
    jsr @_cleanup;
    push [text];
  }

  /* test multi push to stack */
  private void _cleanup() {
     push [r0, r1, r2];
  }
}

The LDEF virtual machine takes care of things like instance management, the VMT (virtual method table), inheritance and everything else. All you have to do is generate the LDEF assembly code that goes into the methods – and voila, the assembler will parse, compile and emit the whole shebang as assemblies (the -L option in the command-line assembler controls how you want to link the assemblies, so you can emit a single file).

The runtime engine is written in vanilla object pascal. It uses generics and lookup tables to achieve pretty good performance, and there is ample room for a JIT engine in the future. What was important for me was that I had a solution that was portable, requires little maintenance, and has an instruction set that can easily be converted to actual machine-code, LLVM or another high level language (like C++, which can piggyback on GCC regardless of platform).
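
As an illustration of the “generics and lookup tables” point, method dispatch can conceptually be backed by something as simple as this (Delphi syntax; the names and the TMethodEntry record are hypothetical, not the actual LDEF runtime source):

program MethodLookupDemo;

{$APPTYPE CONSOLE}

uses
  Generics.Collections;

type
  // Hypothetical descriptor for a compiled method in the bytecode image
  TMethodEntry = record
    Offset: integer;      // where the method's bytecode starts
    ParamCount: integer;
  end;

var
  Methods: TDictionary<string, TMethodEntry>;
  Entry: TMethodEntry;

begin
  Methods := TDictionary<string, TMethodEntry>.Create;
  try
    Entry.Offset := 1024;
    Entry.ParamCount := 2;
    Methods.Add('TBaseObject.main', Entry);

    // Dispatch: resolve a fully qualified name to its bytecode offset
    if Methods.TryGetValue('TBaseObject.main', Entry) then
      writeln('main starts at bytecode offset ', Entry.Offset);
  finally
    Methods.Free;
  end;
end.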

LDEF instructions

The instruction set is a good mean of the most common instructions that real processors have. Conceptually the VM is inspired by the Motorola 68020 chip, in combination with the good old Acorn RISC CPU. Some of the features are:

  • 32bit (4 byte) opcode size
  • 32 registers
  • 1024 byte cache (can be adjusted)
  • Stack
  • Built in variable management
  • Built in const and resource management
  • Move data seamlessly between registers and variables
  • Support for records (struct) and class-types
  • Support for source-maps (both from assembler to high-level and reverse)
  • Component based, languages and emitters can be changed

There are also a few neat language features that I have been missing from Delphi, like code criteria. Basically it allows you to define code that should be executed before the body of a procedure, and directly after. This allows you to check that parameter values are within range or valid before the procedure is allowed to execute.

Here is the pascal version:

function TMyClass.CalcValues(a,b,c: integer): integer;
begin
  before
    if (a<1) or (b<1) or (c<1) then
      Fail('Invalid values, %classname.%methodname never executed');
  end;

  result := a + b div c;

  after
    if result <23 then
      Fail('%classname.%methodname failed, input below lowest threshold error');
  end;
end;

I decided to add support for this in LDEF itself, including the C++ style intermediate language. Here it looks like this:

  /* Enter + Exit is directly supported */
  public void main(r0 as count, r1 as text) {
    enter {
    }

    leave {
    }

    alloc {
      /* method variables */
      uint32 _longs;
      uint32 _singles;
    }
    move r1, count;
    jsr @_cleanup;
    push [text];
  }

Why Borland/Embarcadero didn’t add this when they gave the compiler a full overhaul and support for generics is beyond me. C++ has had this for many, many years now. C# has supported it since version 4, and Java I am unsure about – but it’s been there for many years, that’s for sure.

Attributes and RTTI

Attributes will be added, but support for user-defined attributes will only appear later. Right now the only attributes I have plans for control how memory is allocated for variables, whether registers should be pushed to the stack on entry automatically, and whether the VMT should be flattened and the inheritance-chain reduced to a single table. More attributes will no doubt appear as I move forward, but right now I’ll stick to the basics.

RTTI is fairly good and presently linked into every assembly. You can’t really omit that much from a bytecode system. To reduce the use of pure variables I introduced register mapping, to make it easier for people to use the registers for as much as possible (much faster than variables):

  public void main(r0 as count, r1 as text) {
  }

You may have noticed the strange parameters on this method? That’s because they are not parameters, but definitions that link registers to names (register mapping). That way you can write code that uses the original names of variables or parameters, and avoid allocating variables for everything. It has no effect except making it easier to write code.

LDEF, why should I care?

You should care because, with a bit of work, we should be able to compile fairly modern Delphi source-code to portable bytecodes. The runtime or engine that executes these bytecodes can be compiled using Freepascal, which means you can execute the program anywhere FPC is supported. So more or less every operating system on the planet.

You should care because you can now write programs for Ultibo, the pascal operating system for the Raspberry Pi. It can only execute one real program (your embedded program), but since you can now run bytecodes, you will be able to run as many programs as you like. Just run each bytecode program in a thread, or play it safe and call next() on an interval from a TTimer.
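
To visualize the TTimer approach, here is a hedged sketch. TLDefProgram and its Next() method are hypothetical stand-ins for whatever the finished runtime ends up exposing:

uses
  Classes, ExtCtrls;

type
  // Hypothetical stand-in for a loaded LDEF bytecode program.
  // Next() just counts down here so the sketch is self-contained.
  TLDefProgram = class
  private
    FStepsLeft: integer;
  public
    constructor Create(Steps: integer);
    function Next: boolean; // run one slice; False when the program is done
  end;

  TScheduler = class
  private
    FTimer: TTimer;
    FProgram: TLDefProgram;
    procedure HandleTimer(Sender: TObject);
  public
    constructor Create;
  end;

constructor TLDefProgram.Create(Steps: integer);
begin
  inherited Create;
  FStepsLeft := Steps;
end;

function TLDefProgram.Next: boolean;
begin
  Dec(FStepsLeft);
  Result := FStepsLeft > 0;
end;

constructor TScheduler.Create;
begin
  inherited Create;
  FProgram := TLDefProgram.Create(1000);
  FTimer := TTimer.Create(nil);
  FTimer.Interval := 10;          // give the program a time slice every 10 ms
  FTimer.OnTimer := HandleTimer;
  FTimer.Enabled := True;
end;

procedure TScheduler.HandleTimer(Sender: TObject);
begin
  // one slice per tick; stop ticking when the program reports it is done
  if not FProgram.Next then
    FTimer.Enabled := False;
end;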

You should care because once LDEF is finished, being able to transcode from object pascal to virtually any language will be a lot easier. JavaScript? C++? Python? Take your pick. The point here is standards and a library that is written to last decades rather than years. So my code is very clean, with no silly pointer tricks just to get a few extra cycles. Stability, portability and maintenance are the values here.

You should care because in the future, components like HexLicense will implement its security code in LDEF using a randomized instruction-set, making it very, very hard for hackers to break your product.

Build NodeJS from Source

November 17, 2016 Leave a comment
nodeJS rocks!

Since your Smart Mobile Studio programs can run from Linux on a Raspberry Pi, I figured I could write a few words about building node.js from its C/C++ source. Normally you don’t need to do this, but the process-manager (which makes sure your Smart NodeJS servers can be clustered and essentially controlled as services) can be picky about the version – so you may suddenly find yourself needing the cutting edge.

Open up a command-line prompt and CD (change dir) to the location you want to keep node. Since this is so essential to my system I keep it at root (/), but you may want to stuff it in your home folder.

Either way, cd to the location then do:

sudo apt-get install build-essential
sudo apt-get install curl openssl libssl-dev

Odds are this is up to date (it was on my fresh Raspbian image). If not, let it finish (you may need to press “y” at some point) and let it work itself out.

You also need to install git (may also already be there):

sudo apt-get install git

With those in place, let’s do a clone of the node.js repository:

git clone https://github.com/nodejs/node.git

When it’s finished you should have a “node” folder available. So we cd into that and do a configure:

cd node
./configure
make
sudo make install

Now, building anything on the Raspberry Pi is slow. So the “make” call will probably take 1.5 hours depending on your Pi. If it’s overclocked you will probably get away with 45 minutes, but a bog standard Raspberry Pi 3 is much slower.

When it’s all done, test it with

node --version

As of writing that prints out “8.0.0-pre”, since I’m using the cutting edge pre-release version right now.

Now you can install npm (the node package manager) and then pm2, and enjoy the show :)

Why C# coders should shut up about Delphi

October 18, 2016 Comments off

EDIT: Some 3 years after this satire post was published, a website called “.NetRocks” decided to make a number out of it. The satire clearly went over their heads, and the facts I outline in the post were met with ridicule. I’m a bit disappointed with .NetRocks, because this could have been a great opportunity for them to learn more about modern object-pascal development and its ecosystems. They seem genuinely baffled that Delphi and Object-Pascal in general were being used in 2016 (and 2019), and their behavior and response demonstrates the very arrogance and ignorance we have come to expect from “brogrammers” of their type.

I must admit that I’m somewhat shocked that a massive show like .NetRocks is utterly ignorant of the fact that Object-Pascal as a language has millions of users around the world. You have compilers like Oxygene from RemObjects that target .Net, Java, x86, ARM and WebAssembly. So their whole podcast on “how to port Delphi applications to C#” is absurd, because that’s not how you migrate code from Delphi to .Net. The Elements ecosystem, which covers C#, Pascal, Java, Swift and GoLang, installs straight into Visual Studio and adds Object-Pascal to the environment. You can also convert source-code between Pascal and C# through automation. Manually going about such tasks is amateurish to say the least, but I honestly didn’t expect much when they opened their podcast with verbal abuse.


While .NetRocks is manually porting pascal to C#, the rest of us use the Elements compiler and just compile the code to .Net assemblies. You can also convert the code to any of the 5 languages Elements supports. I have to wonder what rock .NetRocks lives under to be oblivious to this

Another point .NetRocks seem to have missed is that Delphi has been aggressively optimized since Embarcadero took over from Borland. In 2016 Object Pascal ranked as #14 [2.164 score] on the Tiobe index of the most popular programming languages [globally], just nine points below C#. So the most cost effective way of getting your legacy Delphi application modernized is to just buy Elements from RemObjects, or upgrade your Delphi version to this century.

If they had bothered to check, they would also have learned that I manage some 16+ developer groups on social media, with roughly 50.000 developers in total. I have also worked for Embarcadero, the company that makes Delphi, so my statements, while whimsical and satirical, were based on facts. I work with Fortune 500 companies in the US, as well as large companies around Europe, that all rely on Object-Pascal for their daily operations. Heck, even the Walt Disney company wanted to get rid of .Net in favour of Object Pascal in 2018. Turns out JIT is a disaster when coding against hardware interrupts and “old school” automation. Who could have guessed? (that was satire btw).


Some people seem to think that Delphi ended with version 7 back in 2002. Delphi has been under heavy development since Embarcadero took over from Borland. .NetRocks doesn’t even mention that buying a modern version of Delphi is an option companies should consider. And if they absolutely have to migrate to C#, then Elements is what professionals use.

On the whole, it’s alarming that the guys at .NetRocks seem clueless about the tools professional developers use for specialized work. Porting a Delphi codebase by hand is, well, amateurish at best. They might as well do a podcast on how to use MS Paint to create art, because Adobe Photoshop or Illustrator is not what people use (that was satire btw).

I am also pleased however, because .NetRocks unwittingly demonstrated my point to perfection :) Instead of learning more about Anders Hejlsberg’s journey and life before he joined Microsoft, how the .Net class hierarchy is heavily influenced by the Delphi VCL framework, or how the concepts that ended up as .Net were in fact prototyped and experimented with earlier – they jumped straight to ridicule. Of a satire post (lol!).

You can click here for my full reply. And seriously, how it’s possible to mistake this old post for anything but heavy satire is beyond me. But I am thankful to .NetRocks for demonstrating my point. I could not have asked for a better example of arrogance and ignorance :) Rock on!


 


When in Rome

Those that follow my blog or know me personally also know that I don’t go out of my way to be confrontational or disrespectful. I try my best to follow the middle path, to see the positive in all things. But sometimes you face a level of stupid that begs a response. A verbal one. And this is one of those cases.

Lately I see more and more Delphi developers getting into debates with C# developers, and they are confronted with an attitude and belief-system that quite frankly is utter nonsense. It’s not based on history or facts, but on disputes rooted in childish “isms”. A mix of old habits and an unhealthy obsession with the notion that new means better.

It is nothing short of an intellectual emergency.

Old is bad?

If that is the case, then C/C++ should be utter shit, because C is 3 years older than Pascal. Pascal was created to replace C and get away from (amongst other things) the absurd and time-consuming header file practice, not to mention the recursive hell that still occurs when headers reference other headers and conditional symbols pile up.

Do you know how curly brackets became popular? Jump into your nearest Tardis and set the destination to the mid 1960s. Back when mainframes were the size of Portugal and 1024 bytes of memory was the bomb. You see, back then memory was sparse and pretty much all the languages (or toolkits) needed to save space. So the { } curlies were used as tokens for words like “begin” and “end”. Why? Because one ascii byte is less than five ascii bytes. Every byte counts when you only have a few Kb of memory.

Ps: when I use the word toolkit here I really mean a random soup of asm snippets on a loom. The bones collected to make C were actually a mix of homebrew toolkits made by different engineering teams at the time. And each of these toolkits had their own fanclub /slash/ legal representation; because all of them, miraculously, had invented the exact same thing at the same time before everyone else. These guys kept on fighting over who owned what well into the Watcom era. It’s probably the longest standing demonstration of a grown-up tantrum since the Roman emperor Caligula outlawed clothing.

Fast forward to 1970 and the amount of memory on mainframes had already doubled a few times, just like Moore had predicted five years earlier. The C standard that Dennis Ritchie had painstakingly wrangled from the cold, greedy hands of grumpy old technicians was due for an upgrade. Tests were done by very dull, very serious men in grey clothing – and to their horror they discovered that human beings interact with words and phrases faster than symbols and glyphs. So a word like “begin” would be picked up by the brain faster than {. It is just knockout stuff, isn’t it.

You cannot praise Anders as a genius architect and at the same time throw his life’s work in the trash. Perhaps you should ask yourself why Anders loved object pascal so much that he dedicated half his career making sure universities had access to it.

Living on the edge, Niklaus Wirth rocks the scene!

Enter Niklaus Wirth, a man so clever and academic that he doesn’t catch exceptions; he is the exception. But next to his colleagues over at Berkeley, our dear Niklaus is the proverbial rock-star. He decided that in order to make programming more accessible for students, easier to learn and, above all, to establish a standard for safer code, they should take science to heart and include it in a programming language design. So he put on his gravest lab coat and began to implement “a better C”, where he applied the research that words are picked up by the brain faster than tokens and symbols. Nine months later pascal was born, and to celebrate, Niklaus used colored socks that day. A risky bit of business that could have cost him his seat at the strangely brown sorority club, in which he was master of ceremonies. But hey, Niklaus is a rebel and this was the swinging 70s after all.

My point? This is why pascal uses “begin” and “end” as opposed to { and }. It’s one of the many aspects of the Pascal language that makes it so easy to master.

Whimsical satire aside: with the knowledge that C is actually older than pascal, not to mention that it is identical in features, depth and complexity to Pascal – what is it going to be? Do you still cling to the notion that old has to be of less value than new? Or could it in fact be that you haven’t really looked closer at object pascal to begin with? Have you even installed a modern version of Delphi and given it a worthy test drive? Or do you just speak out based on what you like and know, as opposed to what you don’t know and haven’t even tried?

My 8-year-old daughter is like that. She makes up her mind about food before she has even tasted it. Surely a trained engineer can do better?

And I say that because, if we follow your line of reasoning, C/C++ should be incapable of producing C#. How can something old and outdated possibly produce something new and exciting? You do know that C# is written in C and assembler, right? And that Java and all the other high level languages out there didn’t spontaneously self-assemble. They are all written in C, assembler or pascal. And there is a reason for that.

Just like nature has different levels, from particles to minerals, from minerals to bacteria, from bacteria to plants, from plants to insects (and so on) – so does a computer. Each of these levels has laws and principles that govern it, and no matter how much you want to introduce something from a higher level into a lower level – that’s just not how reality works.


The fact that I have to explain this in 2016 demonstrates how education and academia have degraded after C and Pascal were replaced with Java.

The law is very simple: each level can only emit complexity upwards. So assembler is used to produce C and pascal; C and pascal are used to produce C++ and object pascal; C++ and object pascal are used to bring about every other piece of tech you take for granted.

So when you criticize Delphi but use C#, you are just underlining that you don’t really understand the difference between archetypical languages and context based languages.

It’s like saying Visual Basic 5 is cooler than assembler. You don’t debate with people like that, you just look at them with pity. I don’t know how many years a software engineering degree is elsewhere, but if you just spent 3 years in college and 2 years at a university, followed by one or two additional years getting your diploma – and this is the level of insight you graduate with – then I pity you. And I pity your employer, because sooner or later you will bring about the apocalypse.

If we apply rational, logical thought to the proposition at hand, we cannot but conclude that the notion of “new is always better” is false. It is to mistake marketing for facts and capitalism for science. The fundamental principles of computing won’t change because you favour a high-level programming language. The C/C++ embedded market won’t magically roll over, because C# is an unmitigated disaster on embedded devices. And I speak from experience, not preference.

Your teacher should have taught you this: “In a computer, like nature, the past is always alive”. Look closely at your brand new PC or Mac: BIOS, VESA screen modes, non-linear memory, hardware interrupt vectors and a cpu that supports no less than 3 instruction sets. You have layer upon layer of increasing complexity (a.k.a “the past is still there”). Why not call up Linus Torvalds and ask him why he’s not using C# in his line of work; or NVidia why their GPU pipeline only ships with C headers and not C# classes. I sure as hell wouldn’t want to be on the receiving end of that call.

The only C# compiler worth using is actually RemObjects C# (the Elements compiler). Because unlike Mono’s or Microsoft’s compilers, Elements builds proper binaries on the same level as C/C++ and Pascal. You can actually write kernel modules with Elements if you like, something you can’t do with Mono or Visual Studio. Heck, .net services still require a service host on Windows.

C# is better than Delphi?

Anders Hejlsberg, the father of many languages

Actually, it’s not. What today is known as .net, from its CIL intermediate language right down to the global assembly cache, is pretty much Delphi with a new syntax parser on top. Yes, you read that correctly: the entire .net product family is practically Delphi refactored. The whole bytecode regime was one of the last skunkwork projects Anders worked on, and was actually described on Borland’s news-servers years before it magically re-appeared as Microsoft’s flagship product.

Anders Hejlsberg created and prototyped these technologies (as did many other companies, since Java made bytecodes cool) while he was working at Borland. And indeed, the father of C# and entire dot net stack is none other than the father of Delphi.

At Borland (in the borland.language newsfeed, I vividly remember) these technologies were mentioned under the codename “portable Delphi”. I followed the thread on the news servers for quite some time. It was no secret that Borland was tinkering away on the next big thing, and I believe it was ultimately Borland’s response to Java. Because at the time Java had eaten quite a chunk of Borland’s educational seat licenses. And had Anders continued working for Borland, Delphi and C++ Builder would perhaps be bytecode based today.

Factoid: The first project Anders was assigned to when he joined Microsoft was Visual J++, Microsoft’s attempt at hijacking the Java language. Microsoft lost the legal battle and was forced to pivot. The end result would be C#.

So when you, the C# fanboy, propose that Delphi is somehow inferior to dot net, or that object pascal is outdated and technically behind C#, you quite frankly have no clue what you are talking about. Is it generics? We have that. Is it a rich and powerful RTTI? Got that covered. Anonymous procedures? They’re there too. In fact, the whole misunderstanding here stems from the fact that C# developers generally think “Delphi 7” is what Delphi is all about. Which is the same as saying that .Net is what came out in version 1. Who the hell uses Delphi 7 in this day and age?

But just to make this crystal clear: Anders Hejlsberg is not just the father of C# and dot net: he is also the father of Delphi. And before that he gave birth to Turbo Pascal. In fact, Anders had three (if not four) Pascal development systems behind him before he came to Microsoft.

Borland, land of the free

Borland, the company Anders worked for, was put out to pasture by Microsoft. Bill Gates launched a financial onslaught that few companies on the planet could have endured. The reason was that Microsoft really had the worst developer tools on the market (Visual Basic and Visual C++), while Borland represented the very best. So for a long time Borland utterly demolished Microsoft when it came to software development.

When Borland later began to flirt with Linux (Delphi Kylix) it must have made the R&D department at Microsoft piss themselves. Delphi is responsible for thousands of desktop applications synonymous with Windows – and if Linux users got their hands on RAD tools like Delphi, with thousands of applications ready to be ported over? Do the math. You have to remember that this was before cloud computing. Microsoft didn’t have a fallback strategy and depended on desktop and server sales. Linux could do everything Windows could, but it lacked the desktop applications, the user friendliness and the features Borland made simple.

Weird Science, a fun but completely fictional movie from 1985. I just posted this to liven up an otherwise boring trip down memory lane

Wishing for stuff doesn’t make it true, except in the movies

What Microsoft did was, basically, to buy Anders out of Borland (they also picked up a huge part of Borland’s R&D department) and at the same time attack the company from every angle. Drowning them in bullshit lawsuits, patent claims and all the nasty, cowardly practices Microsoft was into back then (we all remember Netscape and how that went).

Bill Gates called Anders up personally and gave him “an offer I could not refuse”. Whatever that means.

So to put this myth to bed and be done with it: C# as a language and dot net as a technology did not originate with Microsoft. It has Borland written all over it, and its architecture is intimately joined at the hip with Delphi and C++ Builder. Companies like Microsoft don’t innovate any more; they buy competence and re-brand it as their own.

Dot Net framework, you mean the VCL?

If that is not enough, take a long, hard look at how the .net framework is organized. Pay special attention to the features and methods of the base-classes and RTTI. What you are looking at is the VCL, the Visual Component Library – Delphi and C++ Builder’s run time library.

If you know Delphi and its RTL intimately, as well as C#, you quickly recognize the architecture. The syntax might be different, but the organization and names are all too familiar.

There was never any secret where this came from, but it's not advertised either, so I don't blame people for not recognizing it. But when people start to get nasty over what should be either a passionate hobby or a professional discipline, then the truth must be told. The very foundation of the dot net framework is taken straight out of the VCL.

And why shouldn't it? The whole platform was whispered about under the codename "portable Delphi" to begin with, it was even designed by the same author – so why should it come as a surprise that it's basically Delphi reincarnated with a different syntax parser? Either way a few bells should start ringing in your head – telling you that you may have been wrong about Delphi and object pascal in general. Probably because you know very little about Delphi to begin with.

So, what you perhaps believe sets C# apart from Delphi, what you imagine gives dot net an advantage over Delphi – is in reality the opposite: it is Delphi. It originated in Delphi, was inspired by Delphi and flows straight from the inventor of Delphi himself. Anders is responsible for two distinct object pascal development platforms prior to working for Microsoft. Take some time to reflect on that.

You cannot praise Anders as a genius architect and at the same time throw his life's work in the trash. Perhaps you should ask yourself why Anders loved object pascal so much that he dedicated half his career to making sure universities had access to it.

Strength in diversity

The fact that C/C++ is alive and well today serves to prove my point: age has no meaning when dealing with fundamental technology. These languages embody the fundamental principles of computing. They don't grow old because they represent a fundamental level that can never be replaced. There is no law that says C must be there; what persists is ultimately the features that C brings – spanning from assembly to the desktop. Those same features can be found in only one other language, namely Pascal.

Things are the way they are for a reason. And programmers before you established standards that address things you might not have to deal with, but that’s because they dealt with it before you. Just because you don’t see the plumbing doesn’t mean it’s not there.

These are the compilers responsible for the foundation on which all other languages, operating systems, applications and services rest. Remove those, and the whole house of cards comes crashing down.

So C/C++ and object pascal continue to exist, thrive even, precisely because they interface with the hardware itself, not a virtual machine or bytecode format. Delphi as a product is naturally tailored for operating systems, as is C/C++. But that is because we use programming languages to create applications. There is nothing stopping someone from taking a compiler like freepascal and writing an operating system from scratch.

Ultibo is an ARM operating system written purely in object pascal

By contrast, C#, Java and Visual Basic are context dependent languages. They exist purely in context with an operating system or run-time environment. They can't exist without these, and they lack the depth C and pascal bring to the table. The depth to span many levels, from the lowest all the way up to cloud and domain driven programming.

Just look at Android, an operating system written in Java, right? Well, that's not really true. The Android operating system depends on a native bootstrap, a piece of code that establishes the run-time environment for Java applications. So when you fire up your Android phone, tablet or set-top box, the first thing to boot is a loading mechanism written in vanilla C. This in turn mounts a filesystem and continues to load a bunch of native drivers. Drivers for the display, storage, wi-fi and all the hardware components that ultimately make up your SoC (system on a chip).

It's only after this long process that the environment is capable of hosting the Java virtual machine. Then and only then is the device capable of executing Java. This is why languages like Java are called context-based: they run in context with an established system.


Notice the "C" part of the Java NDK? Wasn't that exactly what Java was supposed to solve? All that hype, and here you are, back where you started!

So Java, which for the past decade or more has bombarded us with advertising like "platform independence", is a total flop! It's like communism: it looks good on paper, but once you start testing these ideas in real life – they just don't work.

It turned out that the Java virtual machine, when running on small devices like phones and tablets, choked the CPU. Which in turn used more electricity to keep up, which in turn rendered the battery life of these devices null and void. Talk about irony. All that hype and Java couldn't even deliver its most basic promise: platform independence. And you removed C and pascal from our universities for this? It's a disaster on par with the burning of the great library of Alexandria.

There is more than a drop of ego and arrogance in this mentality. Young people especially like to believe they are smarter than those who lived before them. It's a very selfish idea that really needs to be addressed. Take someone like Alan Turing, for example, whose work laid the theoretical foundation for what we call computers today, and who helped design the electromechanical code-breaking machines of the second world war. I would happily bet $1000 that Alan could out-program the lot of us had he gotten the chance. He has been dead for many years, but to imagine yourself smarter or more capable than him would be the definition of arrogance.

PS: This is where we come into the realm of wisdom versus knowledge. The difference may be subtle, but it's so important to distinguish between them.

C# would have been a sad little language without the work of this man, Miguel de Icaza

Either way, without the native context to support it, Java turned out to be a blundering behemoth. And it's the exact same with C#. This is why Miguel de Icaza's work on Mono has been so instrumental to the success of dot net and C# in particular. It was Miguel and his team that pushed C# into native compilation versus JIT for mobile devices. C# would never have been allowed on iOS or Android otherwise.

So that is the difference between archetypical languages and context based languages. Hopefully you now see why this negative attitude towards Delphi and object pascal doesn't make sense. It really is just as stupid as those cults that believe human beings once lived side by side with a T-rex. Anyone with a high-school diploma should instinctively know better.

The problem is when people in great enough numbers start to believe in this nonsense, because then they can actually damage important mechanisms of our eco-system. Insane beliefs really are dangerous, both socially and technically.

Grow up!

I have no problem using C# and consider it a valuable tool in my toolbox. Same with Delphi. But when it comes to kick-ass desktop applications for Windows and OS X, Delphi has no rival. It gives you a productive edge that I haven't found in any other language. And I don't base that on preference. I base that on engineering and the years of development invested in Delphi by both Borland and Embarcadero.

I just find it odd that C# developers seem to lack the ability to speak well of Delphi, because it's a tell-tale sign of immaturity to be incapable of that. Object pascal and C++ are identical in nearly every way (except multiple inheritance, which even C++ programmers shun like the plague, so no loss there). Why then should educated programmers find it reasonable to praise C but trash pascal? It's like saying that one car is faster than another – but when you pop the hood they have the same engine. The exact same machine code generator even. It is psychological masturbation. We are coders; we shouldn't do things like that. We should examine the system carefully and then make up our minds.

And what is a programming language anyway? It is the ability to describe the solution to a problem. The ability to define and control how a series of activities execute. Each language represents a methodology for solving problems. A mindset. Nothing more.

Languages like C, C++ and object pascal will always exist, they will always thrive, because they are instrumental to everything else (the desktop, the operating system, services, everything that makes a computer capable of what it does). It doesn't mean that Delphi is the solution to all problems, far from it. And that is ok.

Context based languages are more often than not designed for a purpose. Java was initially designed as a network language. This makes Java infinitely more suitable for dealing with that line of work. You can do the exact same in C, but you would have to add a few libraries and establish a platform. Which is exactly what C and pascal are good at: implementing fundamental technology. Object pascal is a language you would use when creating other languages. C is more popular for virtual machine development, no doubt there, but there is nothing in C/C++ that you won't find in object pascal.

Interestingly, languages that many believe are long since extinct, like Cobol, are actually alive. Cobol was created to deal with large stacks of numbers and to push associated data through formulas. That is its context, and no other language is better suited for the banking sector than Cobol. Which incidentally is the one place where Cobol is used to this day. And why not? Why waste millions trying to hammer in a nail with a hacksaw? Why not use a language especially designed for that purpose? SQL is likewise a language especially tailored for its tasks. I don't hear anyone bitching about how old and outdated that is. It won't grow old because it embodies the principle of relational data.

And just what exactly is old about Delphi? Generics? Posix support? Weak and strong references? Garbage collection? Attributes? Class, record and enum helpers? Dependency injection? Model view controller frameworks? The ability to write assembler side by side with pascal? The ability to link with C/C++ objects? Cross compilation for Windows, OS X and mobile platforms? What manner of logic are you using where you praise other languages for these features, but for Delphi the same features make it old? Stop repeating what someone once told you like a mindless parrot and investigate. You are an engineer in a field of science; start acting like one.

In my work I use several languages: JavaScript, Delphi, Smart Pascal, Mono C# and even Python. I have also fallen in love with Lua, believe it or not.

For web-pages I prefer Smart pascal which compiles to JavaScript (node.js is fantastic). Then you have shell scripting, a touch of x86 assembly code, a dash of bash on my Linux box and .. well, there is strength in diversity.

It all depends on what is best for the customer and what language yields the most robust solution. Sometimes C# is better suited for my client's infrastructure, other times Delphi makes more sense. But if I was to pick a single language to cover all of them, it would be either object pascal or C++ (gcc). These really have no rivals and will never go away.


So using many languages is actually an enriching experience. It brings more depth to our way of thinking and (consequently) affects the quality of our products.

Anyways, I believe congratulations are in order, because if your favorite language is C#, you have in fact been using Delphi without knowing it. There is so much Delphi in C# and they both originate from the same author. So why not pay a visit to C#'s older brother?

So go download Delphi, Lazarus and Remobjects Oxygene – enjoy the decades of research and excellence these dialects represent. It won't hurt you, it will probably do you a great deal of good.

3D mathematics [book]

September 11, 2016 Leave a comment

It's not often I promote books, but this time I'll make an exception: Mathematics for 3D Game Programming and Computer Graphics.

Sooner or later, all game programmers run into coding issues that require an understanding of mathematics or physics concepts such as collision detection, 3D vectors, transformations, game theory, or basic calculus

A book worth every penny, even if you don't use 3d graphics very often

No matter if you are a Delphi programmer, Smart Pascal, Freepascal, C# or C++; sooner or later you are going to have to dip your fingers into what we may call "primal coding". That means coding that was established some time ago, and that has since been isolated and standardized in APIs. This means that if you want to learn it, you are faced with the fact that everyone is teaching you how to use the API — not how to make it or how it works behind the scenes!

3D graphics

Once in a while I go on a retro-computer rant (I know, I know) talking about the good ol' days. But there is a reason for this! And a good one at that. I grew up when things like 3d graphics didn't exist. There were no 3d graphics on the Commodore 64 or the Amiga 500. The 80's and early 90's were purely 2d. So I have been lucky and followed the evolution of these things long before they became "standard" and were isolated away in APIs.

Somewhere around the mid 80's there was a shift away from "top-down 2d graphics" in games and demo coding. From vanilla squares to isometric tiles (actually the first game that used it was Qbert, released in 1982). So rather than building up a level with 32 x 32 pixel squares – you built up games with tilted, diamond-shaped blocks (or indeed, hexagon shaped tiles).

This was the beginning of "3D for the masses" as we know it because it added a sense of depth to the game world.

qbert

Qbert, 1982, isometric view

With isometric graphics you suddenly had to take this depth factor into account. This became quite important when testing collisions between sprites. And it didn't take long before the classical "X,Y,Z" formulas became established.

As always, these things already existed (3D computation was common even back in the 70s). But their use in everyday life was extremely rare. Suddenly 3d went from being the domain of architects and scientists – to being written and deployed by kids in bedrooms and code-shops. This is where the european demo scene really came to our aid.
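To make that shift concrete, here is a minimal Free Pascal sketch of the classic 2:1 isometric mapping from tile coordinates (plus a height value) to screen pixels. The tile size, program name and routine names are my own for illustration, not taken from any particular game or engine.

    program IsoDemo;
    {$mode objfpc}

    const
      TileWidth  = 64;  // width of the diamond-shaped tile in pixels
      TileHeight = 32;  // height of the diamond (classic 2:1 ratio)

    // Map a tile position (and its height above the ground plane) to screen pixels.
    // This is the 2:1 isometric projection used by countless 8- and 16-bit games.
    procedure TileToScreen(TileX, TileY, TileZ: Integer; out ScreenX, ScreenY: Integer);
    begin
      ScreenX := (TileX - TileY) * (TileWidth div 2);
      ScreenY := (TileX + TileY) * (TileHeight div 2) - TileZ;
    end;

    var
      SX, SY: Integer;
    begin
      TileToScreen(3, 5, 0, SX, SY);
      WriteLn('Tile (3,5) lands at screen position ', SX, ',', SY);
    end.

The Z term is the "depth factor" mentioned above: raising a sprite off the ground simply shifts it up the screen, which is also what makes collision tests suddenly need all three coordinates.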

Back to school

This book is about the math. And it's explained in such a way that you don't have to be good at it. Rather than teaching you how to use OpenGL or Direct3D, this book teaches you the basics of 3D rotation, vectors, matrices and how it all fits together.

Why is this useful? Because if you know something from scratch it makes you a better programmer. It's like cars. Knowing how to drive is the most important thing, but a mechanic will always have a deeper insight into what the vehicle can and cannot do.

vector

Every facet is explained both as theorem and practical example

This is the book you would want if you were to create OpenGL. Or like me, when you don't really like math but want to brush up on old tricks. We used this in demo coding almost daily when I was 14-15 years old. But I have forgotten so much of it, and the information is quite hard to find in a clear, organized way.

Now I don't expect anyone to want to write a new 3D engine, but 3D graphics is not just about that. Take something simple, like how an iPhone application transitions between forms. Remember the cube effect? Looking at that effect and knowing some basic 3D formulas – voila, it's not a big challenge to recreate it in Delphi, C++, C# or whatever language you enjoy the most.

I mean, adding OpenGL or WebGL dependencies just to spin a few cubes or position stuff in 3D space? That’s overkill. It’s actually less than 200 lines of code.
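As a taste of what those "basic 3D formulas" boil down to, here is a small Free Pascal sketch that rotates one corner of a cube around the Y axis and projects it onto a 2D surface with a simple perspective divide. The focal length, screen size and routine names are assumptions of mine, purely for illustration.

    program SpinPoint;
    {$mode objfpc}

    uses
      Math;

    type
      TVec3 = record
        X, Y, Z: Double;
      end;

    // Rotate a point around the Y axis by Angle (radians).
    function RotateY(const P: TVec3; Angle: Double): TVec3;
    begin
      Result.X :=  P.X * Cos(Angle) + P.Z * Sin(Angle);
      Result.Y :=  P.Y;
      Result.Z := -P.X * Sin(Angle) + P.Z * Cos(Angle);
    end;

    // Simple perspective projection onto a 2D surface.
    // FocalLength controls the field of view; CenterX/CenterY is the screen center.
    procedure Project(const P: TVec3; FocalLength, CenterX, CenterY: Double;
      out ScreenX, ScreenY: Double);
    var
      Scale: Double;
    begin
      Scale   := FocalLength / (FocalLength + P.Z); // points further away shrink
      ScreenX := CenterX + P.X * Scale;
      ScreenY := CenterY + P.Y * Scale;
    end;

    var
      Corner: TVec3;
      SX, SY: Double;
    begin
      Corner.X := 50; Corner.Y := 50; Corner.Z := 50;  // one corner of a cube
      Corner := RotateY(Corner, DegToRad(30));         // spin it 30 degrees
      Project(Corner, 256, 320, 240, SX, SY);          // project onto a 640x480 surface
      WriteLn('Projected corner: ', SX:0:1, ',', SY:0:1);
    end.

Run those two steps over all eight corners every frame, draw lines or textured faces between them, and you have the spinning cube transition without pulling in OpenGL or WebGL.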

Well, I'm not going to rant on about this — this book is a keeper!
A bit expensive, but it's one of those books that will never go out of date, and the information inside is universal and timeless.

Enjoy!

 

 

When things don't work out

June 6, 2015 1 comment

With all the respect, love and .. good stuff I have for Delphi. I love it to bits, but this sketch from Fawlty Towers is the epic image of myself going apeshit when I managed to crash my iPad 2 due to "overly optimistic expectations of GPU performance". It's also my natural response to C# :)

Come on you vicious bastard!

Link also available here

Why do some people hate Delphi so much?

May 22, 2015 27 comments

I have to admit that all the rubbish, snide comments about Delphi are really annoying. Why do seemingly intelligent, clever and open-minded individuals hold some grudge or negative judgement regarding Delphi?

Where in God's name does this come from? And who in their right mind, being grown men in their 30's, 40's and 50's, would act like little children over this?

I see it almost everywhere. Even in my last job where we had two teams working with C# and Delphi, joining forces to produce a top quality product. Yet behind closed doors there existed a kind of silly superiority complex amongst the C# users which I found disrespectful, superficial and outright stupid.

What is really a shame is that almost everyone I have met holding such a dumbfounded attitude, the kind that keeps on trashing Delphi and object pascal in general, demonstrates absolutely no idea what they are talking about! Some have tried Delphi in the past and found it too complex, some never liked it — and all of that is ok of course. We can't all be expected to like the same music, food, art or colors – so different programming languages are no exception. But when individual after individual turns out to be just a mindless parrot repeating what other parrots are repeating – I just have to set the record straight.

First things first

The first argument against Delphi to fall flat on its face is the one regarding age. "Oh it's just old crap" I hear people say, "it's not modern enough", or even more humiliating (for them), it's not cutting edge like C++, C# or Java!

The latter argument makes me want to throw up. Literally. Because what these sad individuals fail to recognize is that C and Pascal are from the very same era – Pascal appeared in 1970, C in 1972, and C's roots in BCPL go back even further. Pascal was designed to be a cleaner, more disciplined alternative, because C was never intended to end up in its present state to begin with. Like JavaScript, the syntax was never finished but is a result of old habits, deadlines and a moving market. C is a language made up as they went along; it was not planned or thought out like Pascal, C# or Java.

Back in the sixties and seventies programming languages were very crude; each command was simply replaced by a fixed block of machine code, with various macro mechanisms to set parameters. So in the first pass the compiler checked the syntax, the second replaced your "C" code with blocks of assembly code, the third generated an entry table for each sub-routine and its initializer (hidden routines for allocating static memory), and the fourth pass emitted the now pure assembly text-file to disk. At the end of this horrible process, which could take hours, an assembler was invoked to turn each .asm text-file into machine code object files. And the last step was the linker gluing each piece together into a working program.

As such, "C as a language" was never really finished. C was just a toolkit engineers used when creating operating systems, ROM images and software for mainframes. Whenever they needed a new command they just extended the language as they saw fit.

So it was never cleaned up properly; the syntax was never renamed to be more understandable, the curly brackets { }, which were only used to save precious bytes of memory, were not replaced with begin or end keywords, and the externalized header files were left as they were. An echo from the past, when computers had so little memory that it made sense to parse line by line on disk rather than wasting RAM. And when I write disk, I mean tape.

To make the chaos complete, take into account that there were around 8 different "C" dialects on the market. Just like there are myriads of Basic compilers which all have their own take on what Basic really means.

Education, universities and engineering

So, with the state of the C language in mayhem, competing dialects, compilers, calling conventions and so on — a group of people got together to create "a better C" language. I can imagine that trying to tell X number of companies to do as they say was met with little success. Once people have mastered a language they tend to stick with it, it's always hard to introduce change, especially if you have nothing but an idea on the menu.

So what the pascal group did was to grab the features C had to offer, humanize the syntax, pull the header back into the unit file (so no external .h files are needed), and settle on a single standard calling convention (which is why a "pascal" calling convention exists to this day) — and the result was Pascal.

Humanizing the language (e.g. replacing { } with "begin" and "end" keywords) was done for good reasons:

  • To improve readability of complex low-level code
  • To make it easier to teach at universities for students with no computing background
  • To account for the fact that the human brain recognizes words made up of letters faster than non-alphabetical characters (*1)
  • To better be able to assemble a curriculum fit for tutoring, which was problematic with C due to the large differences in dialect, competing standards and complexity of operators.

[*1]: Actually the brain sees entire words as symbols and is able to recognize and look up their relevance and meaning almost instantaneously. Whereas glyphs like "{" or "}" require the brain to include the characters before and after in order to deduce their context before looking them up. In short: the brain reads human words faster than glyphs.
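Just to make that readability point tangible, here is a tiny, made-up routine written the way the pascal group intended. Where C marks every block with { and }, Pascal spells the structure out in words; the routine itself is purely hypothetical.

    // A trivial, hypothetical routine in the Pascal style: every block is
    // opened with "begin" and closed with "end", so the eye reads words
    // rather than glyphs.
    function Clamp(Value, Lo, Hi: Integer): Integer;
    begin
      if Value < Lo then
      begin
        Result := Lo;
      end else
      if Value > Hi then
      begin
        Result := Hi;
      end else
      begin
        Result := Value;
      end;
    end;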

The result of many years of hard work was Pascal, designed to be a better C without the limitations which haunt C to this very day. The results should speak for themselves: for 30 years Pascal code has outperformed C in almost every aspect of programming. During the 80's and 90's Pascal was the language of choice for game and system developers, especially since Pascal has always supported inline assembler.

Just like C, Pascal has been used to create operating systems from scratch. Recognized by all for its simplicity, its rigid structure, its efficiency and, most importantly, its productivity!

Let’s go back to the beginning

Before we look at object pascal today, let’s go back to the early days. Back when there were at least 8 different C compilers (not to mention a few open-source, free compilers available as well) and there really was no control or standard at all.

So, in 1978 Dennis Ritchie and Brian Kernighan published a book called "The C Programming Language". A lot of kids these days seem to believe that this book marks the beginning of C, and that it sort of defines the language from beginning to end. Sadly that is not true. I wish it had been true, but it's not.

When Dennis and Brian wrote that book the C language was in mayhem. There was no standard at all, nothing to say what was C and what was Basic or BCPL. There was no organization which owned C, and with all those different compilers on the market, each of them different, each of them unable or unwilling to support the other — something had to be done.

Hence the book "The C Programming Language" was released as an attempt to clean up the mess. That meant creating a standard, or subset, which all C compilers should support – regardless of their peculiarities. As long as every compiler supported the proposed sub-set, they could call themselves "ANSI C compatible", and later "ISO C compatible". Sadly this process took many, many years; the ANSI standard was not ratified until 1989, with ISO following in 1990.

Object Pascal = C++

As mentioned above, Pascal was a project to create a better C. What people don't seem to get is that pascal is, underneath the syntax, almost identical to C. Most of the improvements were not binary but rather superficial: a more thought-out and easy to read syntax, no need for external header files, support for inline assembler, the use of $ rather than 0x for hexadecimal — essentially surface-level changes.

So when people trash talk Pascal or Object Pascal, they can't possibly know what they are doing. It is a fact that C++ Builder and Delphi share a common core. This is why you can take .obj files compiled with C++ Builder and link them straight into a Delphi project. And you can likewise take a Delphi compiled unit and use that in a C++ Builder project.

This should give you some sense of what object pascal is capable of. If you have respect for languages like C/C++ then you should also recognize that Delphi and freepascal are equally worthy of that respect. The only things separating the two languages are superficial, syntactical differences. Naturally each language has evolved since the 80's and 90's, and Delphi has many features which C/C++ lack (and vice versa). But inside the heart of Delphi beats the same engine which powers C++.
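For the curious, the mechanics look roughly like this. The unit below is only a sketch: add_numbers.obj and the add_numbers routine are hypothetical stand-ins for something you would actually compile with C++ Builder, but the {$L} directive and the external declaration are the standard Delphi way of handing a foreign object file to the linker.

    unit LinkDemo;

    // Sketch: statically linking a routine compiled by C++ Builder into a Delphi unit.
    // add_numbers.obj and add_numbers are made-up names for this example.

    interface

    function AddNumbers(A, B: Integer): Integer;

    implementation

    {$L add_numbers.obj} // hand the object file to the Delphi linker

    // The body lives in the object file; "external" tells the compiler that
    // the linker, not Pascal code, resolves this symbol.
    function add_numbers(a, b: Integer): Integer; cdecl; external;

    function AddNumbers(A, B: Integer): Integer;
    begin
      Result := add_numbers(A, B);
    end;

    end.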

Back it up with facts or be silent

"You can't use Delphi, that's just .. you can't"

Why? I hear people utter these statements, but I have yet to hear a good reason as to why. Why can I use Visual C++, C++ Builder and even Visual Basic (!!!) to write applications, but not Delphi? Since I have outlined that Delphi and C++ Builder share the same core, under what pretext should Delphi be banned while C++ is allowed? And who in their right mind would allow Visual Basic, which is a language for children, yet ban the most widely used language in the western hemisphere for the past 30 years? Is there an insanity virus going around?

Let's be honest: all of these statements are of course rubbish. Java and C# fan-boys will always tell you that their language is the best, and they fight like cats and dogs between themselves too. So when Borland lost the economic fight against Microsoft, they all had a chance to kick Delphi and C++ Builder while they were down for the count. But what company would really stand a chance against Microsoft?

Microsoft is a company so spiteful and egocentric that it created C# when Sun Microsystems refused to give up control over Java. And just look at the damage C# has done to Java in just a couple of years!

Does that mean that Java is “crap”? Does it mean that C# should be allowed while Java should be banned? Or should we allow Visual Basic but refuse to accept Java? Because that’s the kind of insane thinking we are dealing with here.

Of course A doesn't exclude the use of B. And had Microsoft really gone after Sun with the intent of destroying them, you know just as well as I do that Sun would be dead and buried — and Java along with it.

So before you start trash-talking Delphi, Borland and Embarcadero, please do yourself the favor of actually knowing what you are talking about. Because to people who know several languages, people who were developers 20 and 30 years ago, you sound like a complete moron. Especially when you say that Delphi is old, or Delphi is just "this" or "that", like a child lobbying for his favorite soccer team. At least have the wits to add why you believe Delphi is crap, and then investigate whether that notion is fact or fiction.

Did Borland get their asses kicked by Microsoft? Yes they did. Just like Netscape did and thousands of other companies over the years.

Is Delphi dead? Absolutely not! It's used by thousands of high-end companies on a daily basis all around the world. In fact, since Borland sold its R&D department to Embarcadero the language has once again caught up with C++, and the user base is growing! It goes without saying that when Microsoft stole the lead developer and inventor of the product, and essentially left Borland to the wolves, development stopped for a while. But that was years ago and it's ancient history.

And just like C++, compiled Delphi has no dependencies. You don't need to install a truckload of DLL files to execute your programs. They are native machine code, so they run much faster than Java and C# (roughly 30% faster than C# JIT code). So what makes more sense? Ignoring Delphi and keeping on using C# for high-priority servers? Or being smart and using Delphi to get that extra speed boost. 30% means a lot when dealing with millions of records per day, yet I see companies falling for the hype and picking C#, only to be haunted by more problems than I care to list.

And you also get to use freepascal absolutely free. You can compile your programs for Linux, Macintosh and Windows — you can target Nintendo, Xbox and Playstation. You can target iPhone and Android devices — and even a few novel operating systems like QNX, Embedded DOS, AROS, MorphOS and Amiga OS (although don't hold your breath if you plan to make money on Amiga these days).

And then there is experience and coverage. It goes without saying that a language which spans back to the early days of C++ has gone through some rigorous tests and stood the test of time! Delphi as a product comes with a broad background and a public library of code for almost everything under the sun. And the Delphi codebase is still growing! Why should your company pay millions to hire people to re-invent the wheel in Java or C# when you can get an experienced Delphi developer to deliver faster solutions using code which has been running successfully for a decade?

Think about that for a while before you give in to the weak-minded mantra of "mine is better than yours", because odds are your language of preference can quickly find itself in a similar state a few years from now. Besides — a programming language represents a school of thought, and just look at the hundreds of thousands of software titles created using Delphi. Look at the millions of software problems Delphi has helped solve! To deny that is quite frankly thoughtless. Wisdom and knowledge are valid, no matter what language is used to express them.

Hire a Delphi superhero!

May 18, 2015 2 comments

This is your chance to hire a Delphi super-hero, namely me! That’s right, I’m presently in the market for a new full-time job!

Note: if you need people for a short project then I personally must decline, but I can forward your posting to Delphi Developer (4,000 Delphi developers) and post it here on my website (an average of 15,000 visitors per month)

I must underline the obvious, namely that I live in Norway with no intention of moving. However, I have a fully established office which has been in use for three consecutive years, so I am used to working remotely, communicating through Skype, Messenger, Google Meetups and similar utilities. This has worked brilliantly so far. Full time employment over the internet is finally a reality.

For Norwegian companies I can offer 50% commuting (100% in Vestfold), from Oslo and Akershus in the north to Sandefjord and Larvik in the south. The reason I can only offer 50% is that I have custody of two children every other week, which means I am unable to drive long distances in combination with school and similar factors.

You need an awesome Delphi developer? Then I'm your man!

Jon Lennart Aasenden, 41 years old senior software engineer using Delphi

My experience is too long to list here, but if you are in the market for a dedicated software engineer then you have come to the right place. Simply drop me a line on e-mail (address at the end of this post) and I will get back to you with a full resume ASAP.

In short: I have worked with Delphi for 15 years, I am the author of Smart Mobile Studio and wrote the VJL (run-time library, similar to the VCL) from scratch. I have worked for some of the biggest companies in Norway. I have good papers, an excellent reputation and am recognized for my skill, positive attitude and "yes we can" mentality.

Before my 15 years of Delphi I spent roughly the same amount of time programming games, multimedia and contributing to the demo scene. In my view, demo programming is one of the most rewarding intellectual exercises you can engage in. It teaches you how to think and prepares you for "real life solutions" as a professional.

In my career as a Delphi developer I have written several commercial products, ranging from a complete invoice and credit management application, games and multimedia products, and serial management components for Delphi, to backup clients and much, much more.

No matter what your company works with, as long as it's Delphi I'm the right man for the job.

My kind of job

It goes without saying that I am looking for a purely Delphi centric position. Although I have worked with C++ and C# and have no problems navigating and adding to such a codebase — there can be no doubt that Delphi is the environment where I excel at my work.

I bring with me a solid insight into HTML5/JavaScript and am willing to teach and educate your existing staff in Smart Mobile Studio, which I am also the founder and inventor of; bringing your team up to speed with the next step in computing, namely distributed cloud computing, in which JavaScript and Object Pascal play central roles.

Torro Invoice

So if you are moving towards cloud computing or want to leverage the cloud for service oriented architectures using RPC or REST servers, I am the guy you want to talk to. Especially if you want to use object pascal and leverage nodeJS, JavaScript and the whole spectrum of mobile devices. In short, with me on your team you will have a huge advantage.

What you get is 15 years of Delphi experience; and before that, 15 years of software authoring, demo and game development on 16 bit machines.

Good qualities

I have included a small list of "ad-hoc" qualifications. The list below does not serve as anything other than indicators. I have been a professional programmer for many, many years and to list every single piece of technology is pointless. But hopefully the list is enough to give you an idea of my qualifications, depth and diversity.

  1. Write custom Delphi controls and UI widgets
  2. Create control packages and resource management for our projects
  3. Optimize procedures by hand by refactoring variables, parameters and datatypes.
  4. Write a database engine from scratch and also use existing engines such as ElevateDB, MSSQL, Oracle, Firebird, SQLite and MySQL
  5. Work with Windows API directly
  6. Work with freepascal and port code to Linux and Macintosh
  7. Write networking libraries, custom protocols and high-speed multithreaded servers
  8. Use classical frameworks like Indy, Internet Direct and other out of box solutions
  9. Write advanced parsers for source-code and any complex text-format
  10. Write a real-life compiler and IDE (www.smartmobilestudio.com and quartexpascal.wordpress.com)
  11. Create a graphics library from scratch, supporting different pixel formats
  12. Work with large files, performing complex operations on gigabytes and terabytes
  13. Write mobile and desktop applications using Smart Mobile Studio and FireMonkey
  14. Create your server environment, architect your RPC (remote procedure call) service infrastructure
  15. Implement custom services through nodeJS, Smart Mobile Studio, Delphi and FreePascal
  16. Use Remobjects product line: such as RemObjects SDK, Hydra and Data-abstract
  17. Link with C and C++ object files
  18. Use project management tools, scrum, SVN, github and package management
  19. Write, export and import DLL files; both written in and imported to Delphi
  20. Write solid architectures using modern technologies: XML schemas, stylesheet validation, REST, node services

.. and much, much more!

Note: While you may not have a need for all of these qualifications (or any one of them), the understanding that the listed topics demand is of great value to you as an employer. The more knowledge and hands-on experience a programmer can offer, the better qualified he is to deal with real-life problems.

Tools of the trade

All programmers have a toolbox with frameworks, code collections or third party components they use. These are the products I always try to use in projects. I have a huge library of components I have bought over the years, and I have no problem getting to know whatever toolkit you prefer. Below is a short list of my favorite tools, things I always carry with me to work.

  • RemObjects SDK to isolate business logic in RPC servers; this allows us to use the same back-end for desktop, mobile and HTML5 JavaScript.
  • RemObject Data Abstract
  • Remobjects Hydra
  • ElevateDB and DBIsam (DBIsam for Delphi 7-2010, ElevateDB for the rest)
  • Developer Express component suite
  • TMS Aurelius database framework
  • MyDac – MySQL data access components
  • MSDac – Microsoft SQL server data access components
  • Raize component suite
  • FreePascal and Lazarus for OS X and Linux services
  • Mono/C# when customers demand it (or Visual Studio)
  • Firemonkey when that is in demand
  • Smart Mobile Studio for mobile applications, cloud services [nodejs] and HTML5/Web

Tools and utilities

  • Beyond compare (compare versions of the same file)
  • Total commander (excellent search inside files in a directory, recursive)
  • DBGen – A program written by me for turning Interbase/Firebird databases into classes. Essentially my twist on Aurelius from TMS, except here you generate fixed classes in XML Data Mapping style.

Note: For large projects which need customization I tend to write tools and converters myself. The DBGen tool was created when I worked at a company which used Firebird. They had two people who hand-wrote wrapper classes for nearly 100 tables (!) They had spent almost two months and covered about half. I wrote the generator in two days, and it generates perfect wrappers for all tables in less than 10 minutes.
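Purely to illustrate what "wrapper classes" means in this context, below is a hypothetical sketch of the kind of class such a generator might emit for a CUSTOMER table. The unit, class and column names are invented for the example and are not the actual DBGen output.

    unit CustomerRow;

    // Hypothetical sketch of a generated wrapper class for a CUSTOMER table.
    // One field/property pair per column; the generator emits one class per table.

    interface

    uses
      DB;

    type
      TCustomerRow = class
      private
        FId:      Integer;   // CUSTOMER.ID
        FName:    string;    // CUSTOMER.NAME
        FCreated: TDateTime; // CUSTOMER.CREATED
      public
        // Copy the current record of an open dataset into the wrapper
        procedure LoadFrom(Dataset: TDataSet);
        property Id: Integer read FId write FId;
        property Name: string read FName write FName;
        property Created: TDateTime read FCreated write FCreated;
      end;

    implementation

    procedure TCustomerRow.LoadFrom(Dataset: TDataSet);
    begin
      FId      := Dataset.FieldByName('ID').AsInteger;
      FName    := Dataset.FieldByName('NAME').AsString;
      FCreated := Dataset.FieldByName('CREATED').AsDateTime;
    end;

    end.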

Safety first

As a father of three I am only interested in stable, long-term employment. Quick projects (unless we are talking six figures) are of no interest, nor do I want to work for startup groups or kickstarters. I am quite frankly not in a position to take risks; children come first.

Many of my products are available for purchase online

I am used to working from my own office. I was initially skeptical about that, but it has turned out to work brilliantly. I am always available (within Norwegian working hours) on Skype, Google, Messenger and phone.

Languages and work format

For the past year I have been working full time with C# under Microsoft Visual Studio. Before that I used Mono for about four months, which I must admit I prefer over Visual Studio, and wrote a few iPhone and iPad apps. So I am not a complete stranger to C# or other popular languages. But as I have underlined in my posts on countless occasions – object pascal is a better language. It is truly sad that Borland was brought down all those years ago, because they dragged down with them what must be one of the most profound and influential languages of our time.

That said, I have no scruples about mixing and matching languages. Be it Remobjects Oxygene, C#, C++ or Object Pascal. I also do some prototyping in BlitzBasic, which is without question the fastest (execution wise) of the bunch, but not really suitable for enterprise level applications.

The point here being that I am versatile, prepared to learn new things and not afraid to ask. It would be a miracle if a man could walk through life without learning something new, so it's important to show respect and remain humble. Different languages represent different approaches to problem solving, and that is an important aspect of what it means to be a programmer.

The only thing I ask is that I am properly informed regarding the language you expect me to deliver in – and that I receive the required training or courseware in due time. Do not expect super-hero status in C#; I am absolutely above average, but it cannot be compared to my efficiency and excellence using Delphi.

Contact

If you are interested in hiring me for your company, then don’t hesitate to contact me for a resume. You may e-mail me at lennart.aasenden AT gmail dot com.

Why Amiga is cooler than .net and the perfect embedded environment

May 6, 2015 7 comments

I know, you probably think I'm just another fanboy trying to convince you to use a 30 year old, 16 bit operating system. Or perhaps I'm just a lunatic with fond memories of my childhood gaming experiences.

No, I'm actually talking serious stuff here. OK, here goes:

Unless you have been living under a rock for the past 10 years, computing is heading full steam ahead into virtualization. Companies like VMWare have been doing this for well over a decade now, and before that there were thin clients and multi-account Windows installations that were all the rage (or Linux, or OS X for that matter).

Point is: companies are tired of getting boxed into a particular piece of hardware or processor.

MorphOS is an Amiga clone, but it's bound to outdated PPC hardware

The solution is two-fold: You can go the way of VMWare and emulate a whole machine, from the BIOS to interrupts to USB triggers. An absolutely astounding piece of engineering if you ask me.

The other route is what Microsoft has been doing, namely to ensure that their programming tools are platform abstracted. The .net framework is more than just a framework or coding standard. It's a virtual assembly language for a virtual processor which you implement in software. What happens is that the first time you run those bytecode assembly instructions, the JIT compiler assembles them into real, processor-bound op-codes which run natively.

Have you heard anything like that before?

Amiga, really?

Now switch over to UAE and do a comparison: The UAE platform is, give or take its dependence on the host, a complete self-sustaining virtual hardware environment. It's designed in much the same way as the .net HAL (hardware abstraction layer) is, and it has the benefit of running on a tiny kernel.

So while people may feel that Amiga-OS is long gone and dead, it's actually now coming back as a perfect virtual environment. Why perfect? Because Windows 10 is still 800 megabytes (embedded preview) while AmigaOS 3.9 or 4.0 + linux kernel + bootstrap is less than 50 megabytes (!) And that's the full OS with all bells and whistles.

A quick visit to AmiWeb to grab your assembler, your BlitzBasic and your C++ compiler, and you have a pretty awesome portable, virtual platform running on cheap x86 parts.

I sincerely hope there is a good coder out there that can revive Amithlon as an open-source system.

What the hell is Amithlon?

AmigaOS running on x86 without Windows or a dedicated host OS

Amithlon is a tiny version of Linux which is designed to boot only one thing. In fact it's just the bare bones of Linux, no applications, no X server or anything like that (just drivers, bash and rendering via the framebuffer). So it has just the ability to boot into a very small and compact setup. The boot sequence does one thing and one thing only: boot into a custom-built Amiga emulator. A JIT based environment which treats mc68k machine code as (drumroll) bytecode (!)

Amithlon allowed you to install Amiga OS 3.9 on ordinary x86 PC hardware. So finally you could take advantage of cheap and powerful x86 parts and run AmigaOS at high speed using a custom JIT engine.

Where is Amithlon now?

For some insane reason Haage & Partner, which seems to have shipped this system a few years back, just shelved it. I don't know why so many stupid decisions seem to haunt the Amiga, but this one really is the frosting on the cake. A fully working system which allowed Amiga to run fully abstracted on cheap x86 hardware! And they fu**** shelved it!

I would really like to see that system go open-source. Who has the code? Where did it go? It worked brilliantly yet H&P shut it down — why?

If you have the source, please, please send me a PM and I'll compile it, set up an SVN repository and Git mirror, and generally make it available to the masses.

Smart Mobile Studio, customize your new IDE

March 27, 2015 Leave a comment

The Smart Mobile Studio designer has never really been any good. We wanted to use a commercial designer at one point, but due to technical limitations in the components we were unable to do so. In short, having more than one instance of a designer in the application caused some serious conflicts and instability.

Instead we decided to stick with our own variation. Most of us never really used the designer, because we are used to FreePascal, C# and library coding under Delphi. The designer for iOS in Xcode on the Mac is hopeless, so we tend to create even the design in our code. So this decision didn't have such a hard impact on us. Sadly that was not the case for our users, who wanted to design HTML5 interfaces without the clutter of Dreamweaver or the dullness of Web-Matrix.

So I can only say I’m sorry we had to do this, but the code generation and support for hardware was more important at the time.

But times change, and right now it's the tools that need attention! All in all you will be happy to learn that in the next update, which is just around the corner, the designer and all its features have received a lot of love!

The Delphi style designer in action

First, don't let my insanely blue color theme in the picture above bother you, I was just playing around with the palette. You are free to either disable theme support or pick another which suits your taste better. The same goes for the text-editor colors. The IDE now ships with several preset-files, and you can create your own palette which can be saved, loaded and made default with a single click.

But back to the designer, what is really exciting is that we actually have 3 different designer modes (!) These are:

  • iOS
  • Delphi
  • Wireframe

In the picture above we are using the Delphi inspired visual designer. It's a bit dull but I must admit that for me this is the most familiar to use; and holy cow is it fast compared to our previous layout control! Christian has really outdone himself this time and created a designer we can all be proud of.

The first designer was my concoction. I never got to finish it, and while it worked quite well with ordinary components, it was ill suited for the threaded rendering required by the webkit engine. So Christian pretty much had to throw everything out and write it from scratch using Graphics32 (which he has helped create).

For those interested, take a second to contemplate just what takes place inside our designer at any given point. Because it's not "simple" X/Y coding:

As you create your layout we actually build a small program of your layout, where all the components are created and positioned. This tiny program is then compiled in a separate thread, and ultimately rendered by the WebKit engine to an off-screen pixmap. We then talk to JavaScript directly from Delphi and enumerate the position (including rotated and/or scaled offsets) of each control – before we grab the control pixels and cache them in buffers associated with each designer element (phew!).

In short, this is how you can move and render HTML5 controls around inside a Delphi/FPC program just like they were ordinary VCL/FCL controls. Pretty neat right?

Either way, whatever our problems with the designer were in the past, that is now ancient history! The new form-designer is silky smooth, pixel perfect, easy to use and very, very fast!

Themes, palette and your taste in bling

Another really fantastic collection of features is how the new IDE deals with colors. We all love them, and we all have different perceptions of what good UI design looks like.

Well, finally we have added the same amount of options as (if not more than) Microsoft Visual Studio — from being able to customize the code editor, the gutter section and the mini-map, to applying a full theme to the IDE in general. Below is a picture of the IDE using a different theme from the image above.

Personalize the IDE just the way you want it

My personal favorites are "Amethyst Kamri" and "Smokey Quartz Kamri". Some love them and others hate them. But at least you have options to change the general look and feel of the IDE! And if you hate skinning and tweaking colors, just pick the default Windows theme and load in a bog-standard Lazarus/Delphi editor preset.

When we combine all this tweaking freedom with the completely overhauled and completely dockable IDE sections, you can pretty much personalize around 90% of the application.

A forgotten feature

The class browser was a great idea borrowed from the C# Mono project, but sadly we had to down-prioritize it earlier on – and it has remained in a sort of "limbo state" since version 1.2.

In the next update you will discover that we decided to give that old gem some love too, and it really shows! Indexing is faster than ever, and searching for classes and methods is finally, after what seems like ages – efficient and useful!

The class browser was never created to be documentation. If you have some experience with C# or C++ through the Mono project, you will know that a class browser is just for looking up class names and methods without having to change file or Ctrl-click out of your current unit.

C# programmers use this like mad, and assign it to a clever key-combo. The .net framework is huge, so remembering everything is impossible. Also, when focusing on a difficult task it's not always productive having to scroll to the top of the unit, Ctrl-click on another unit, then Ctrl+F to search (and so on). If all you want is the name of that class or a member – that's what the class browser is for. Nothing more, nothing less.

Finally it's a breeze to hunt down a class to inspect its members

More to come

Over the past 2 weeks I have unveiled more and more features. And we are not done yet. There have been plenty of fixes and advancements in the compiler, the linker and how applications are put together. So you have plenty to look forward to!

I have already mentioned quite a bit about the RTL changes, but not all — so stay tuned for more insight into what is without a doubt the best release of Smart Mobile Studio since 1.2 (!)