Archive
Moving to Linux, why and how – Part 1
Linux. You either love it or hate it. And if you are like me, with a background in either Amiga or Mac, only to “grow up” and get a Windows PC for college or work – then chances are you have already tried several Linux variants over the years, but always ended up going back to Microsoft Windows.
My reason to use Windows is twofold: Working as a programmer for many years I have gotten to know Windows from the inside out. My primary programming language, Object Pascal, is strongly represented on Windows in the form of Embarcadero Delphi (previously Borland) – so simply dropping Windows overnight was never an option. Dropping Windows would in effect mean putting myself out of work. So that’s not gonna happen.
Also, I don’t carry any ill will towards Windows. It has put food on my table all my adult life, so why should I hate that? I use Windows on a daily basis – but in my line of work it happens primarily through VMWare, not on an actual Windows machine. Hence I don’t want to throw huge amounts of cash out the window (pun intended) for an installation I quite frankly don’t use or need. As long as VMWare or any other virtualization tool is at hand, I can do my work.
Secondly, learning a new operating system takes time. How much time depends, of course, on how much you need to know about something before you are comfortable using it. If you know every inch of Windows, and rate yourself an expert in all things Microsoft – then odds are you will resist going back to “newbie” or “lamer” status. People tend to stick with what they know, precisely to avoid feeling like a beginner.
If that seems about right, then keep on reading.
Macs are doing it
If you ask the average Mac user where the preference files for a service are stored on his system, I doubt you will get an intelligent answer. The emphasis being “user”, not programmer, in this context. The world of Apple is a purely consumer-oriented enterprise. Users pay more money for a Mac exactly to avoid having to be an expert (!) In fact, that is one of Apple’s major selling points: you can be a complete novice and still operate highly advanced technology.
A lot of very productive Apple users have absolutely no idea how Unix works. And Apple is doing their best to obscure the fact that they are selling Unix to the masses: hiding the underlying filesystem, renaming important system files depending on the locale (localization of common system names) and much, much more. A Mac was never designed for technicians or hackers (in the original, kind sense of the word), and under no circumstances for people who “think different”.
Be that as it may — the point of mentioning Macs was to demonstrate that you don’t have to be an expert in all things Linux in order to enjoy it. So what if you don’t know how to invoke some low-level kernel function (because calling kernel32.dll under Windows is what you do daily, right?). If Linux has a program you enjoy using, which is helpful and makes you productive – why not make use of it? Why should we pay Microsoft $250-$400 for a system which, to be frank, you can’t trust? Windows Vista was a complete waste of money, only saved in the 11th hour by Windows 7 (which made PCs usable again). But as the smoke settled, Microsoft went ahead and screwed up Windows 8 with “tiles”, trying to be something they are not. Leave iOS to Apple and stick to what Windows does best: the Windows desktop and Start button.
Linux in 2014
If you had asked me about Linux some 10 years ago, I would probably have urged you to stick with Windows. My own experience was rather dim: always missing drivers; no games, only freeware crap; no Delphi, no Visual Studio – and you had to be an expert to uninstall a simple program. And should your disk get read/write errors, you were helpless unless you knew a Linux expert. Not a pretty picture. No wonder people avoided Linux and stuck with Microsoft.
Today, however, Linux is a very different system than it was back in 2004. Ubuntu specifically has financial backing, and the owner of the company has spent millions paying developers to write drivers and “fix” the package system. Installing a program under Linux has always been easy – but getting rid of it later (especially if something went wrong) was a nightmare. For Ubuntu at least that is a tale from the past, and what you face now is — for lack of a better phrase — a polished experience very close to Apple’s OS X. With one crucial difference: the majority of applications in its “app store” are free – and installing them is done with a single click.
Virtualization
For the past 8 years most of my development has been done in VMWare, even on Windows. I own several Delphi versions and have isolated each development environment in its own virtual machine, complete with various versions of Windows. That way, I can create and test my programs on all versions of Windows from XP through Windows 8, using Delphi 7 through XE5. This setup, although time-consuming to create, has saved my bacon more than once. Especially when it comes to testing and bug-fixing (!) It also means that I can bring my machines with me on an external disk – no matter where I work. Many of my clients insist that I work “in office”, and spending days setting up their PC to match my development needs is no longer a problem. I just install VMWare Player – and I’m ready to code.
The benefit of all this is that the underlying operating system doesn’t matter as much as it once did. As long as I can install VMWare – I can use Windows, OS X, Linux and even SPARC Solaris for that matter (!). For me personally it means I save quite a bit of cash, because I don’t need to pay for more Windows licenses. I own the licenses for my virtual machines — and that’s it.
As a “Delphi” programmer I can also enjoy FPC/Lazarus natively on Linux, and generate binaries for Windows, OS X and Linux from the same IDE. Perhaps one day Embarcadero will offer Delphi for Linux and Mac, but until then I don’t have to pay thousands just to do my job (!).
When it comes to everyday tasks Linux also suits my needs. There are a ton of OpenOffice forks out there (LibreOffice is very polished), plus email clients, browsers, music players and DVD/movie players to choose from. And should I feel the urge to play games I can just fire up Steam – or Windows in VMWare – and play whatever I love. I’m more of a retro gamer, so I enjoy MAME, ScummVM and Amiga emulators more than modern games (I own an Xbox and PSX 1-3, so I don’t suffer in that department).
Getting started on Linux
If you fancy getting started with Linux as a Delphi programmer then this article series will be for you. I will go through everything: getting your Delphi installation into VMWare, installing Ubuntu on your machine – and installing VMWare on your Ubuntu installation. We are also going to dig into Lazarus and various other development tools, not to mention getting Smart Mobile Studio to run under Linux so you can continue to write cutting edge HTML5/JS applications.
In the meantime, download the latest Ubuntu distro and burn it to a DVD/CD-ROM. Spend a few minutes making sure you have backed up everything to an external drive (FAT32 or NTFS formatted) so you are absolutely sure nothing can be damaged if you do something wrong. Dropbox is a neat place to store your source files 🙂
Object Pascal, power computing at extreme budgets
Being a programmer in 2014 – having been a coder through the 90’s and 2000’s until present day – is like living in Disneyland. When I was a teenager my most prized possession was my Amiga computer. It cost a fortune and was powered by a whopping 7MHz MC68000 CPU (yet due to its custom chipset, it outperformed PCs up until GPU-powered 3D cards became standard around 1995). The Amiga was the bomb back then. And its memory capacity was a jaw-dropping 2 megabytes of on-board chip-ram. I extended it with 8 megabytes of additional ram, which set me back around $400 – for a poor student a small fortune.
What can you get for $400 today? Well, today you can pick up a second-hand Apple Power Mac G5 dual-core PPC for less than $200, complete with a decent monitor. And if you know how to use BitTorrent, you can pimp that machine so full of software that it has more value than a new $1995 iMac. Photoshop, Pro Tools and gcc/Xcode were awesome on the PPC as well, not just on the Intel Macs.
Object Pascal on older hardware
Lately I find myself thinking that it has to be possible to recycle some of these old machines and apply them to new and modern purposes. I mean, a G5 PPC Mac was a processing beast compared to its contemporary PCs 8 or 10 years ago. The G5 dual-core processor was the final evolution of the PPC range – a fine and powerful piece of engineering for a ridiculous price. Making use of these machines with Object Pascal sounds both fun and interesting.
With this in mind I did a quick search on Finn.no, a Norwegian second-hand market comparable to Craigslist, eBay and sites of that genre. The average price for perfectly usable, good-condition second-hand Apple G4 and G5 Macs was in the 500-1000 NOK range – roughly $100-$200! That is a lot of CPU power, ladies and gentlemen. In fact, it’s almost sad to see these machines, which look more like works of fine art than computers, being practically thrown out the door for the price of a PlayStation game.
So what? I hear you say. No modern software will run on these machines – so they are useless. It’s just a heap of unusable iron taking up space.
Well, not quite. Linux happily runs on PPC hardware – some users argue that Linux runs better on these machines than Mac OS Classic and OS X did to begin with. And if you are adventurous, able to mentally disregard the OS factor (for now), and would like to use Object Pascal for specific work-related tasks, you can also consider an alternative operating system called MorphOS.
Now before we continue, let’s look at a couple of tasks where old hardware can be recycled and which support Object Pascal (FreePascal and/or Lazarus in a desktop environment). And lest we forget, there is a fork of Lazarus called CodeTyphon which is rarely talked about, yet enjoys a steady following around the world due to its stability and rich component base. And it’s free (!). But first, let’s look at some tasks suitable for recycled hardware:
Web server
The most common task for older hardware is, naturally, to serve as a vanilla web server. Apache (the de facto web server for Linux and other alternative operating systems) is for the most part hardware agnostic – and as long as the Berkeley TCP/IP stack is in place, compiling Apache from source is easy and hassle-free (three lines from the command prompt under Linux/Unix).
If you add NodeJS to the mix then you have yourself a paradise for Smart Mobile Studio development, since Smart Pascal allows you to write both client and server from the same codebase, running the server code under NodeJS and the client in any HTML5-compliant browser. So building your own client/server environment for testing purposes at home for less than $200 is more than possible.
Note: NodeJS may not run on older versions of OS X, but it will almost certainly run if you install Linux.
Backup server
Another kind of server which is a must these days, both at the office and at home, is a dedicated backup server. A lot of people fork out $400-$600 for a network disk or NAS (for backup and movie streaming), but that’s actually a complete waste of money; because, as I mentioned, you can pick up a second-hand PPC Mac for 1/6 of the price which is many times faster and has plenty of room for IDE disks (not to mention remote desktop options, so you can control it from your work PC or Mac). OS X also has features for setting up a machine as a backup device for other computers – and sharing movies and music on the network is a matter of flicking a switch.
Education
If you work as a teacher, pupils don’t really need access to the very best. In fact, learning to program in Lazarus on a second-hand PC or Mac (the latter recommended) running Linux or OS X is a fantastic way to broaden the pupils’ horizons. Delphi has become extremely large and heavy in terms of technology; beginners without proper documentation can quite frankly get lost in Delphi XE 1 through 7. So starting with Lazarus and FreePascal, which is a delight to use on Fedora Linux, is an excellent start!
A good Object Pascal programmer could make a network program for tests and exams, which makes sense for schools on a budget. You don’t have to fork out thousands of dollars or pounds for a uniform computing environment plus software.
Store front-end
If you work in technical retail, your store front is bound to run presentations, video and/or demonstration slideshows. It can cost as much as $2000 to buy a professional multimedia studio, adapter packages and cables, not to mention database integration for daily updates of prices and offers. With the help of FreePascal and a dedicated machine, an old iMac G5 is more than powerful enough to handle a full shop front-end, multiple monitors (even chained, if you like) and/or database presentations. Graphics32 (which has been ported to Lazarus/FPC) makes effect programming extremely easy – and you can throw in OpenGL if that tickles your fancy.
Paint mixing and customer kiosk systems
Paint is one of those items everyone uses, but we rarely think about it until we need it. Most stores that sell paint have digital mixing programs where the customer can select and adjust (create) their own blend. It’s actually a very simple application – at least those I’ve seen – which I could complete in roughly two days of work. The only time-consuming task in such a project is coding the serial-cable protocol for transferring the RGB color values to the mixer. Again — old hardware is up to the task. A PPC G4/G5 is ample power for running a fullscreen, mouse-driven Object Pascal application — and FreePascal is very well evolved, so you will find everything you need in the RTL.
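To give a feel for how little code the serial side actually needs, here is a minimal FreePascal sketch. On Linux a serial port is just a device file, so plain text-file I/O is enough for a simple protocol. The device path and the "R,G,B" line format are my own assumptions for illustration – a real mixer defines its own protocol.

```pascal
program SendColor;
{$mode objfpc}{$H+}
{ Sketch: send an RGB triplet to a (hypothetical) paint mixer over a
  serial port. "/dev/ttyUSB0" and the comma-separated line protocol
  are assumptions, not a real mixer's specification. }
uses
  SysUtils;

procedure SendRGB(const ADevice: string; R, G, B: Byte);
var
  Port: TextFile;
begin
  AssignFile(Port, ADevice);  // on Linux the serial port is just a file
  Rewrite(Port);
  try
    WriteLn(Port, Format('%d,%d,%d', [R, G, B]));
  finally
    CloseFile(Port);
  end;
end;

begin
  SendRGB('/dev/ttyUSB0', 128, 64, 200);
end.
```

For real hardware you would also configure baud rate and parity (e.g. via termios or a library such as Synapse), but the core of the job really is this small.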
Media Server / Center
A $30 Raspberry Pi mini computer is more than enough to power the latest Linux media station (or server) software. Since Linux is extremely popular you will also find recent versions of FreePascal and Lazarus in most distros (including the Raspberry Pi repositories). If you don’t want to fork out for an Apple TV or Google TV stick, you can easily build one yourself with a Raspberry Pi.
Build server (SVN)
While perhaps a bit overkill for the lone hobby programmer, it can be a valuable exercise for professionals and amateurs alike. I have my own SVN server running and use it in combination with a backup server to keep my 15+ years of code safe and up to date. Using an old G5 or G4 to maintain your company source code (if you represent a team of 2-10 programmers) is not just good practice – it’s a required minimum.
And if you are thinking “how can I do nightly builds on a PPC machine?” — well, if you use FreePascal that won’t be a problem (it’s a multi-target compiler). But if you use Delphi you may want to run Windows under Bochs (a bare-bones Windows XP is enough if all you want to do is compile):
- Bochs for OS X (PPC): http://www.macupdate.com/app/mac/11177/bochs
Selecting hardware
If you can get your hands on a second-hand Apple iMac G5 workstation (PPC processor), which in Norway at least can be picked up for around 1000 NOK (US $200), that is an absolutely fantastic machine. It is also “modern” enough to run a good selection of alternative operating systems (alternative to Windows and OS X) as well as OS X up to version 10.5, if memory serves me right. This is a perfect machine to recycle for new tasks.
Apple G4 machines, which I find aesthetically pleasing (and easy to fix, replace parts in and code on), are also a good find. But you should check the hardware against the MorphOS hardware compatibility list — which is also a good list for Linux (to check if your old machine can be used with modern Linux distros).
Older PCs are likewise perfect for recycling, but once again you should check for driver support (which is always the problem with PC/Windows machines, as opposed to Macs which have a fixed chipset). I would not buy a PC older than 7 years, and would pay very little for such a machine ($40-$50). If you buy or recycle an older PC, make sure you have at least 4 GB of RAM (the max for an unmodified 32-bit WinXP) if you plan to run Windows; the same goes for Linux.
At the very lowest end of the spectrum, but surprisingly the most fun to play with, are embedded micro computers such as the Raspberry Pi. Starting at $30 it comes with a quite powerful GPU, making it ideal for homebrew media center projects. It supports Linux and as such has full access to Object Pascal through FreePascal. Due to the very small processor, the Lazarus IDE itself is a bit too slow for serious work — but FPC/Lazarus executables run very fast, and it is in my view the best language to use, side by side with C++. If you combine FPC with SDL (Simple DirectMedia Layer) you have a pretty modern multimedia engine to play with, regardless of CPU and architecture.
What about the web
The downside of working with older hardware is that you can mostly use it for development. The moment you want to ship a product written in platform-independent Object Pascal, you have to get your hands on a machine from the modern marketplace. But there is one combination where you can avoid all that – and that is if your target medium is HTML5 exclusively.
While Delphi XE 1 through 7 is far too processor- and memory-hungry for an older PC, Smart Mobile Studio is absolutely perfect. It comes with a small and compact RTL for making cutting-edge HTML5 mobile apps. It has a Chrome-based browser built in (embedded) and is more or less everything you need to write JavaScript-based applications designed for either web pages (embedded like a Flash app would be) or a fullscreen mobile app. You write Object Pascal, and the compiler generates hardcore JavaScript from that.
And JavaScript is extremely fast, in many cases (especially when it comes to graphics) faster than native Delphi (which sounds ridiculous, I know, but check the benchmarks and see for yourself). JavaScript also has the benefit of running pretty much anywhere in a modern browser.
So one very cheap alternative is a 5+ year old PC with Windows XP set up just for Smart Mobile Studio development. I actually have several such machines set up, both real hardware and virtual machines (VMWare).
- V8 JavaScript engine for PPC (OSX): https://github.com/andrewlow/v8ppc
- NodeJS for PPC (OSX): https://github.com/andrewlow/node
Final verdict
Is it possible to build your own fantastic Object Pascal super computers on an extreme budget? 10 years ago the answer would have been a loud “No!”, but today the reality is that you can buy extreme amounts of processing power second-hand for next to nothing. And you can make use of FreePascal and Lazarus to build custom systems – systems which can be re-compiled on more modern hardware when needed. This makes for some very interesting cross-platform solutions.
I should also mention that several virtual machines, like Bochs (free), run perfectly fine on PPC hardware, meaning that you can in fact set up a test environment for your Delphi and/or FreePascal projects on an older Mac – and just remote-desktop your way into it whenever you want.
And as mentioned above, your own SVN server is another good use of old hardware.
Well, I hope you have found some inspiration to recycle technology and put your Object Pascal knowledge to new and exciting uses. Who knows – perhaps you come up with a good idea and can ship out 100 used Macs preloaded with your software?
Nothing is impossible 🙂
Assassin’s Creed hoodies, scam?
Updated 30.12.2015: The website has now simply changed its domain name to leogary (http://www.leogary.com/), but be aware: it’s the same scam and the same company!
A few weeks back I came across this website: www.assassinshoodies.com, which sells some alternative and cool-looking hoodies, sweaters and jackets based on the in-game designs from Assassin’s Creed. So I ordered two size L hoodies for my 12-year-old son. Since sizes differ across borders I would have to order an XL if I wanted one myself — but my son loves the game and was so looking forward to getting “real” Assassin’s Creed hoodies, so I went ahead and ordered.
As you can see from the picture below, the hoodies don’t actually have a hood. They have a high neck with a supported brim, which is perfect for Norwegian weather in the fall; add a small scarf inside the neck and you can safely go jogging or send your kid to soccer practice in cold weather.
What did we get?
First of all they managed to send us two hoodies in size XXL, meaning they don’t even fit a grown man, let alone my 12-year-old son! Secondly, they sent them to the wrong postal code, so I had to drive 40 minutes to pick them up.
But last and perhaps most importantly — they sent us two ordinary hoodies of a completely different brand (!)
The hoodies are plain old sweater hoodies in low-quality fabric, marked “MIAO DU”, which after some googling appears to be a Chinese production company making just about everything under the sun.
The odds of accidentally missing the size, sending a completely different low-quality brand – and missing the postal code – are in my view extremely low. It’s like ordering Italian pizza and getting a kebab delivered next door from a place you never called.
So while the company I ordered from is American, their production and storage are in China. How many people do you think bother to return an $80 order, which took 5-6 weeks to arrive from China — back to China, after being ripped off? You might as well throw the money out the window, twice!
So I urge everyone not to order a single piece from http://www.assassinshoodies.com. I was under the impression I was buying from an American company which could be trusted, but in fact everything seems to take place in China. And I don’t order things from China, due to past experiences with false information, scams and an utter lack of respect for customers.
Be warned — stay away!
Castalia parser, how to use
Everyone who has ever wanted to create their own scripting engine, or indeed their own compiler, has checked out the free and open-source Delphi parser called Castalia. Written by Jacob Thurman, the Castalia parser is also a vital part of the Castalia Delphi IDE extension. If you haven’t had a look at Castalia before, head over to JT’s website and have a look.
To sum up the situation: Castalia is a commercial product, but the Castalia Parser is open-source and can be used, improved and downloaded by anyone. In fact, the Castalia parser is hosted on GitHub here: https://github.com/jacobthurman/Castalia-Delphi-Parser
So why isn’t it used in more projects? Well, because it’s hardly documented, and examples are thin on the ground, to say the least. But perhaps more than anything else — parsing programming source code is not within the range of “normal” programming. There are a few good books on compiler and parsing technology out there, but in general this is something you either learn through advanced computing classes at a university, or by simply spending time on the subject. So it takes a bit of thinking before you jump in.
So how do we use it?
Simple. But before we dig into the how, you have to understand the why. Without understanding why a parser is designed the way it is, you won’t really be able to use it properly. So here are some simple facts to teach you just that:
- A parser reads a file character by character, not line by line like humans do. To a parser, CR (carriage return, #13) and LF (linefeed, #10) have no meaning at all; they are just characters like any other. It’s up to the parser to treat these as line-end identifiers.
- A human-readable word, like “begin” or “procedure”, is simply “an array of chars” with no meaning. The meaning is once again provided by our code.
- Connected to the parser is something called a “tokenizer”, a class meant to recognize words and characters and turn them into tokens – which we can then work with in our programming.
So a character like “(” becomes “OpenRound”, and “;” becomes “Semicolon”. A tokenizer is the computer’s version of what human readers recognize as the language being parsed.
A tokenizer is what makes it possible for us to write criteria and expectation code. Expectations are the criteria the parser holds for upcoming code; these are called “future” elements, while tokens that have already been read are called past criteria.
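To make the idea concrete, here is a tiny FreePascal sketch of character-to-token mapping. The token names are mine for illustration; Castalia uses its own, much richer pt* token set.

```pascal
type
  // Illustrative token set, not Castalia's actual pt* enumeration
  TToken = (tkUnknown, tkOpenRound, tkCloseRound, tkSemicolon, tkIdentStart);

// Map a single character to the token class it begins.
// A real tokenizer would then keep reading to complete words and symbols.
function CharToToken(ch: Char): TToken;
begin
  case ch of
    '(': Result := tkOpenRound;
    ')': Result := tkCloseRound;
    ';': Result := tkSemicolon;
    'a'..'z', 'A'..'Z', '_': Result := tkIdentStart; // start of a word
  else
    Result := tkUnknown;
  end;
end;
```

A real tokenizer accumulates identifier characters into a word and then checks it against the reserved-word list (so “begin” becomes a keyword token rather than an identifier), but the principle is exactly this simple.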
Take something simple, like parsing the following:
procedure TMyClass.MyProc(aValue: Integer);
begin
end;
The code for dealing with the keyword “procedure” will have expectations about what comes next – namely that it’s a valid procedure name. It can also have criteria about the past, for instance: is it defined inside a class declaration? If so, the class name should be omitted and no begin/end section is allowed (that belongs below the implementation marker).
Moral? Parsing is just as nitty-gritty as you imagine it to be!
With that in mind, have a look at the Castalia parser source code. You will notice that each language word or feature is mapped directly to a class member. What you are supposed to do is override those you need (a full engine would override them all) and use the information to build a program model in memory (a program would be a list of units, each containing a list of classes, functions, constants, variables and so on).
What you must take into account is that elements are not created complete, but one piece at a time. A class declaration, for instance, should be registered in your model when the class method triggers (you have to check that it’s not just a forward declaration, of course), while the name of the class is delivered afterwards as the parser gets to it. So you need to write your parser like this:
type
  TMyParser = class(TmwSimplePasPar)
  private
    FCurrentClass: TMyClassDef;
    FCurrentMethod: TMyClassMethod;
  end;
So whenever a class is encountered and should be registered, you register the object without a name and assign it to FCurrentClass. Whenever a class method triggers, it should be applied to the current class and also assigned to FCurrentMethod, so that setting the name later is easy. And so forth. This applies more or less to all the constructs of the language, which Castalia covers and deals with nicely!
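The register-first, name-later pattern can be sketched like this. Note that the handler names (ClassDecl, ClassIdent) are placeholders for illustration – they are not Castalia's exact method names, so check the parser source for the real virtual methods to override.

```pascal
type
  // Minimal model object for a class declaration (from the snippet above)
  TMyClassDef = class
  public
    Name: string;
  end;

  TMyParser = class(TmwSimplePasPar)
  private
    FCurrentClass: TMyClassDef;
  protected
    // Placeholder handler names, not Castalia's actual API:
    procedure ClassDecl; override;   // fires when a class declaration starts
    procedure ClassIdent; override;  // fires when its name is delivered
  end;

procedure TMyParser.ClassDecl;
begin
  // Register the class without a name; the name arrives later in the stream
  FCurrentClass := TMyClassDef.Create;
  inherited;
end;

procedure TMyParser.ClassIdent;
begin
  if Assigned(FCurrentClass) and (FCurrentClass.Name = '') then
    FCurrentClass.Name := Lexer.Token; // fill in the name as it is parsed
  inherited;
end;
```

The important habit is always calling `inherited` so Castalia's own parsing logic keeps driving the traversal while your overrides merely observe and record.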
You could, perhaps, build a simplification layer on top of Castalia which triggers events like “OnRegisterClass(Sender: TObject; Ancestor: string; aName: string)” or something similar, but Castalia really is extremely well made and simple to work with once you understand how it works.
Language rules
But now you probably wonder — what about rules? If the parser just blindly recognizes words and triggers the corresponding method in the Castalia parser class — what protects us from translating gibberish? I mean, would “begin procedure function” actually compile?
No. The rules of Object Pascal are maintained by two things:
- Expectations to future words (read: symbols)
- Expectations on past words
For instance, a method dealing with procedure names would expect the past word to be “procedure”, and it would also expect future symbols to be either OpenRound (followed by a list of parameters) or just a semicolon.
procedure SOMENAME(params);
procedure SOMENAME;
If the expectation on future symbols (being either “(” or “;”) is not met, then you know something is wrong and it’s a syntax error. The same goes for expectations on past symbols: if the past word is not “procedure” or “function”, then once again we know something is wrong and throw a syntax-error exception. If the past symbol is “function”, we would also expect “:” followed by a datatype at the end of the declaration. So parsing a function is a bit different from parsing a procedure. It also determines whether the built-in variable “result” is allowed in the body.
Either way — when you combine the complete set of expectations for each keyword, the end result is what we call Object Pascal, and it defines what you are allowed to do in every aspect of the language. You don’t need a huge kit of code to test for everything under the sun first; you test the validity of the program one symbol and token at a time.
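The expectation pattern for the procedure-name example above might look roughly like this. The pt* token names follow Castalia's naming style, but ReadName, ParseParameterList and SyntaxError are placeholders I have invented for the sketch.

```pascal
{ Hedged sketch of future-symbol expectations after a procedure name.
  ReadName, ParseParameterList and SyntaxError are illustrative helpers,
  not Castalia's actual API. }
procedure TMyParser.ProcedureName;
begin
  ReadName; // consume the identifier itself
  case Lexer.TokenID of
    ptRoundOpen: ParseParameterList; // procedure Name(...);
    ptSemiColon: ;                   // procedure Name; -- nothing more to do
  else
    // neither "(" nor ";" follows the name: the source is invalid
    SyntaxError('"(" or ";" expected after procedure name');
  end;
end;
```

Every handler in the parser follows this shape: consume what you are certain of, then branch or fail based on what the lexer says comes next.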
To read “future” tokens you use the following:
Lexer.InitAhead;
FNextToken := Lexer.AheadTokenID; // what is the future token?
The extended ID of the “present” position can be read using the TokenID and ExID properties. There really is no need to document these features here, since they are used everywhere throughout the source code. You will also find how to use them in the actual methods you want to override. And last but not least, expectations are handled by calling “Expected(ptSemiColon);” on entry (replace the token with whatever you expect).
What typically confuses people when playing with Castalia is the utter lack of an object model. Which is fair enough – that is not the job of a parser/lexer/tokenizer at all.
Object model, the missing piece
What Castalia is, is a very effective hand-written parser for Delphi. The author(s) really should be praised for it – and it’s a shame it hasn’t been deployed in more projects over the past 5-6 years. It’s a gem that has been lying around unnoticed, waiting for someone to pick it up and make use of it. But it’s missing the piece people naturally want, and that is an object model for a unit or program.
Such an object model is typically called an AST, which is short for “abstract syntax tree”. How you implement and organize this piece of code differs from language to language and compiler to compiler, but a few common denominators are found across the board.
The AST is basically what the .NET “intermediate language” is compiled from. So the first pass of the .NET compiler breaks the source code into an AST; the second pass takes the AST and compiles it into CIL code (a virtual assembly language) – which at its final stage is translated into real machine code.
If you think this is easy work – think again, because finding a model that is flexible enough to work AND be maintained over time is extremely hard. In fact, it doesn’t get much harder in programming than writing compilers (with the possible exception of coding a kernel or an OS).
Take something simple, like optimizing the generated code. Let’s say we have a procedure which looks like this:
procedure TMyObject.SomeProc;
var
  x, y: Integer;
begin
  for x := 0 to 255 do
    Inc(y);
end;
A silly example, I know, but could this be optimized? Yes indeed. By unrolling the loop we could handle 8 calls per iteration, or 16; that is a typical optimization to improve performance. It has the benefit of being simple to automate and free of side effects (if the conditions are right):
procedure TMyObject.SomeProc;
var
  x, y: Integer;
begin
  for x := 0 to 31 do
  begin
    Inc(y);
    Inc(y);
    Inc(y);
    Inc(y);
    Inc(y);
    Inc(y);
    Inc(y);
    Inc(y);
  end;
end;
But in order to do this automatically, the AST has to be well enough built. Remember, what we are dealing with is an abstract syntax tree. So in reality the procedure looks something like this:
- [TQTXClassMethod]
  - [TQTXClassMethodLocalVariableList]
    - Items
      - X
        - Datatype: TQTXDatatype.dsInteger32
        - Name: “X”
        - InitialValue: [QTXConstantValue] 0
  - Items
    - FOR
      - Target: [QTXClassMemberLocalVar]
      - Start: [QTXConstantValue] 0
      - End: [QTXConstantValue] 255
      - Items
        - INC
          - Target: [QTXClassMemberLocalVar]
          - Value: [QTXConstantValue] 1
In this case the inner expression block consists of just one "INC" call, so we could perform the unrolling by simply duplicating the expression block X number of times (8 or 16 being traditional factors) and get away with it. But once the internal expression block (the code within the for/next loop) is too complex, that won't work. In fact, it may screw up the entire program.
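That duplication step can be sketched in JavaScript (again with invented node shapes, not any real compiler's internals). The key is that unrolling is only attempted when the body is provably safe to duplicate – here, a single INC-style statement on a variable the loop itself does not control:

```javascript
// Unroll a FOR node by a given factor, but only when the body is trivially
// safe to duplicate (a single "inc" on a variable other than the counter).
function unroll(forNode, factor) {
  const iterations = forNode.end.value - forNode.start.value + 1;
  const safe = forNode.body.length === 1 &&
               forNode.body[0].kind === "inc" &&
               forNode.body[0].target !== forNode.target &&
               iterations % factor === 0;
  if (!safe) return forNode; // too complex: leave the loop untouched

  return {
    kind: "for",
    target: forNode.target,
    start: { kind: "const", value: 0 },
    end:   { kind: "const", value: iterations / factor - 1 },
    body:  Array.from({ length: factor }, () => ({ ...forNode.body[0] }))
  };
}

const loop = {
  kind: "for", target: "x",
  start: { kind: "const", value: 0 },
  end:   { kind: "const", value: 255 },
  body: [ { kind: "inc", target: "y", value: { kind: "const", value: 1 } } ]
};

// 256 iterations unrolled by 8 => "for x := 0 to 31" with eight inc(y) calls
const unrolled = unroll(loop, 8);
```

Anything that fails the safety test simply falls through unchanged, which is exactly the conservative behaviour you want from an optimizer.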
A second optimization would be to remove the whole for/next block and replace it with a single:
- INC
  - Target: [QTXClassMemberLocalVar]
  - Value: [QTXConstantValue] 256
That would be the next best optimization (note that the loop runs 256 times, from 0 to 255 inclusive). But since we are only incrementing a local variable and then exiting, the procedure should ultimately be eliminated from the compiled code altogether, because it doesn't alter existing data or produce any output. This is an area of compilation where DWScript is really, really good. It is a perfect example of a seasoned, evolved and well designed compiler and code generator. Eric Grange really did a fantastic job on that one.
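Those two follow-up optimizations can be sketched the same way (node shapes are still my own, for illustration): first collapse the loop into a single addition, then drop the whole procedure if it only ever touches its own locals:

```javascript
// Collapse "for x := a to b do inc(y)" into a single inc of (b - a + 1).
function collapseLoop(forNode) {
  if (forNode.body.length === 1 && forNode.body[0].kind === "inc") {
    const times = forNode.end.value - forNode.start.value + 1;
    return { kind: "inc", target: forNode.body[0].target,
             value: { kind: "const", value: times } };
  }
  return forNode;
}

// A procedure whose statements only write to its own locals, and which
// produces no output, can be removed from the generated code entirely.
function isDeadProcedure(proc) {
  return proc.statements.every(stmt =>
    stmt.kind === "inc" && proc.locals.includes(stmt.target));
}

const loop = {
  kind: "for", target: "x",
  start: { kind: "const", value: 0 },
  end:   { kind: "const", value: 255 },
  body: [ { kind: "inc", target: "y", value: { kind: "const", value: 1 } } ]
};

const collapsed = collapseLoop(loop);   // inc y by 256, loop gone
const proc = { locals: ["x", "y"], statements: [collapsed] };
console.log(isDeadProcedure(proc));     // whole procedure is dead code
```

A real compiler does this over a much richer node set, of course, but the shape of the reasoning is the same.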
Why all the references?
If you look at the pseudo-tree above and think "what's with all the references? Why not simply have X where it says X?", then you probably don't get the big picture yet. There are many different value types a parser/compiler has to deal with. For instance, there are local variables for a procedure, but there are also class-global fields. Some fields are visible only to members of that class (private), some are also visible to descendants (protected), while others are in plain sight for everyone (public). Then there are unit variables, which can be accessed by unit procedures (both class methods and ordinary procedures) – depending on whether they are defined above or below the implementation marker. Properties, on the other hand, cannot be handled in the same way, since they involve a getter/setter mechanism.
The only way to deal with all these different types from a unified codebase is to derive them all from a common value class which implements standard methods for getting and setting values. Hence the numbers "0" and "255" are not explicitly defined; instead these constant values (constant because they don't change) must be registered and evaluated as they are used.
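In JavaScript the "common value class" idea might look like this (class names are hypothetical, not from any real compiler): every storage kind derives from one base that knows how to get and set a value, so the code generator never has to care which kind it is holding:

```javascript
// One base class for every kind of value the compiler tracks.
class Value {
  constructor(v) { this.v = v; }
  get() { return this.v; }
  set(v) { this.v = v; }
}

// Constants are registered once and refuse mutation.
class ConstantValue extends Value {
  set() { throw new Error("constants cannot be reassigned"); }
}

// Locals and class fields differ in where they live, not in their interface.
class LocalVar extends Value {}
class ClassField extends Value {
  constructor(v, visibility) { super(v); this.visibility = visibility; }
}

// The code generator treats them all uniformly:
const operands = [new ConstantValue(255), new LocalVar(0),
                  new ClassField(7, "protected")];
const sum = operands.reduce((a, o) => a + o.get(), 0);
console.log(sum); // 262
```

This is why the tree stores references like [QTXConstantValue] 0 rather than a bare 0 – the node is an object with behaviour, not just a number.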
The original values are resolved back to their initial representation when a code generator uses them. A C# codegen would end up with something like this:
for (int x = 0; x <= 255; x++) { y++; }
Writing a new compiler
Parsers and compiler AST models are extremely complex and time-consuming to build, but they are also the most rewarding intellectual endeavour you can pursue in the world of computing. At least I think so. Making a game or a utility program is of course fun, but nothing beats making a real, live compiler which takes code and "makes something" from it.
I really hope this little article inspires more use of Castalia, both the commercial product – as well as its free Delphi parser. There is ample room in the world of Delphi for new and interesting compilers, so give it a whirl!
Chainable effects for Smart Mobile Studio
Just made a small but very effective SVN commit to QTXLibrary. The non-callback variations of the effect methods are now functions, allowing for some interesting chainable effects.
For instance, you can now do this:
FPanel1.fxMoveTo(0.4).fxFadeIn(0.4).fxMoveDown(0.3).fxZoomOut(0.2);
Since all the fx*() methods return "self", you can chain effects like this – and they will execute in sequence thanks to the callback check (which waits while an effect is busy).
All the power of jQuery / jQuery UI without all the hassle of raw JS 🙂
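Stripped of the DOM and the timing, the chaining mechanism itself is simple. Here is a hypothetical JavaScript sketch (QTXLibrary itself is written in Smart Pascal; class and method names below are invented for illustration): each fx method queues its effect and returns "self", and the busy-check drains the queue strictly in order:

```javascript
class FxControl {
  constructor() { this.queue = []; this.log = []; }

  // Each effect method queues its work and returns "self",
  // which is what makes the call chain possible.
  fxMoveTo(secs)  { return this._push("MoveTo", secs); }
  fxFadeIn(secs)  { return this._push("FadeIn", secs); }
  fxZoomOut(secs) { return this._push("ZoomOut", secs); }

  _push(name, secs) {
    this.queue.push({ name, secs });
    return this; // chainable
  }

  // The busy-check: run effects strictly one after the other.
  run() {
    while (this.queue.length > 0) {
      const fx = this.queue.shift();
      this.log.push(`${fx.name}(${fx.secs})`);
    }
    return this.log;
  }
}

const panel = new FxControl();
panel.fxMoveTo(0.4).fxFadeIn(0.4).fxZoomOut(0.2);
console.log(panel.run()); // effects execute in the order they were chained
```

Returning "self" from every mutator is the whole trick – the same pattern jQuery uses for its fluent API.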
Head over to https://code.google.com/p/qtxlibrary/ and update your SVN repo now!
Smart Mobile Studio goes native
Yes, you read that right. I am presently in the process of completing an add-on for Smart Mobile Studio which turns your compiled JavaScript into native win32 and win64 applications. Applications that can be distributed, installed and executed just like ordinary programs.
Multimedia and hybrid application development
With the release of the secondary compiler, you will be able to produce native 32-bit and 64-bit hybrid applications, meaning that you take full advantage of the browser's multimedia capabilities – but also reap the benefits of direct access to the operating system; on par with Embarcadero Delphi and other native programming languages.
- Create native Windows tile apps
- Perfect for games and multimedia
- Full access to the file-system
- Build web and native apps from the same source with no change
- Removes same-origin policy, giving your apps access to cross-domain data
- Take full advantage of web-workers (threading)
Distribution
The compiler collects and binds all your external resources (JavaScript, HTML, CSS, images and data files) into a single executable file. The application can start in full-screen mode (the user can exit full screen using CTRL + ESC). The application has no dependencies except for the Chromium Embedded DLL libraries.
Users can expect a significant speed boost as rendering threads are executed with elevated priority, making Smart Mobile Studio a perfect game and multimedia engine.
Alternatives
At the moment you can achieve much the same result with Apache Cordova (PhoneGap) and nodeWebkit. The easiest to work with (from the command line) is without doubt nodeWebkit.
JavaScript for the future
More and more hardware and software supports or runs JavaScript directly – anything from micro-controllers to Tizen, the latest mobile operating system from Samsung, which opens for full application development using nothing but JavaScript. Microsoft likewise surprised everyone when they picked JavaScript as a primary language for Windows tile apps and desktop widgets. And for those situations which demand native code, you simply package your app using PhoneGap to achieve just that.
JavaScript is the language which has seen the most growth over the past 4 years – and it is becoming more and more important for future-proof application development.
Smart Mobile Studio leverages JavaScript and makes programming advanced, object-oriented, large-scale applications both fun and enjoyable. And when I write object-orientation I mean the real deal, not just hyped prototype cloning like other frameworks offer.
For a more in-depth presentation of Smart Mobile Studio click here – or head over to www.smartmobilestudio.com for the full specs.
SMS fonts, once and for all, part 3
Right, in the past two articles (first here and second here) we established two sets of routines in the same class. In short, what we have finished so far is:
- The ability to check if a font is installed
- The ability to traverse and find the font used by an html element
- The ability to measure text
Now the last point on that list is extremely tricky. While measuring static, one-line text (read: text that does not break) works as expected – perfect for captions and other single-line measurements – measuring more complex segments of HTML is proving to require more work.
To illustrate what we want to do here:
As you can see from this highly scientific diagram I slapped together in Paint (the typos are free, by the way), by setting the "overflow" CSS property to "scroll" (please see the CSS documentation for the overflow property here), we basically turn the yellow DIV element into a Delphi-like TScrollbox. So what we are effectively looking for is the clientwidth/clientheight of a TScrollbox which has more content than is being displayed.
HTML5 is a bit silly regarding this, in that the scrollWidth and scrollHeight properties, which contain the full size of the blue box (the values we want), are only filled in if we have set the overflow/display properties right – and the content actually spans beyond the visible display. If not, scrollWidth or scrollHeight may be nil (undefined) depending on what fits within it. This is why I scale the element to 4×4 pixels before measuring – to make sure both values are kosher.
And now for the problem (or feature, depending on how you regard the collected wisdom of the W3C): we have to write to the style of the element, but read values back from the calculated style. Why? Because the world of browsers is not (believe it or not) an exact science. As discussed ad infinitum by now, the browser can assign several styles to an element (hence the "cascading" in CSS), which results in a final style simply called "the calculated style". Which the W3C in their infinite wisdom have decided to store in a completely separate location (window.getComputedStyle as opposed to, oh say, element.calculatedStyle).
To make things even more fun, the browser may in fact end up breaking things up too much when we scale down – so truth be told, we may fall back to clientWidth/clientHeight or offsetWidth/offsetHeight instead. Confused yet? You're welcome.
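That fallback order can be isolated from the DOM entirely. Here is a sketch in JavaScript, with a plain object standing in for the element – the property names match the real DOM ones, but the helper function is hypothetical, not part of any browser API:

```javascript
// Pick the best available size for an element: scrollWidth/scrollHeight
// when the browser filled them in, otherwise client*, otherwise offset*.
function measuredSize(el) {
  const pick = (...candidates) =>
    candidates.find(v => typeof v === "number" && v > 0) ?? 0;
  return {
    width:  pick(el.scrollWidth,  el.clientWidth,  el.offsetWidth),
    height: pick(el.scrollHeight, el.clientHeight, el.offsetHeight)
  };
}

// scrollHeight is undefined here (the content fits), so we fall back.
const el = { scrollWidth: 320, scrollHeight: undefined,
             clientWidth: 300, clientHeight: 44,
             offsetWidth: 304, offsetHeight: 48 };
console.log(measuredSize(el)); // { width: 320, height: 44 }
```

The width and height are resolved independently, which is exactly the awkward situation the browser can put you in: a valid scrollWidth alongside an undefined scrollHeight.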
Fixing it
Now, the problem with the old model was simply that by setting the element to 4×4 pixels in size, I was not in fact measuring the actual content – instead I was measuring the largest word in the string. Which gave some pretty funky visual results, to put it mildly.
I have since updated the routine and I am very happy to report that it now works as expected – and finally my casebook demo auto-calculates the height of a new-item perfectly (at least under WebKit). I have yet to do battle with Mozilla and Opera, but I suspect that at least Opera will have its ducks in a row.
So here is the updated version. I will omit the word “final” because the browser can still throw some quirks at us – but at least now we know what we are dealing with!
Something awesome
QTXLibrary expands the normal TW3CustomControl with attribute storage. I decided to add two methods for measuring text directly at the level of TW3CustomControl. They grab the font information for that element directly – so you don't have to worry about those parameters, just provide the text you want to measure. Use the fixed-width version if you know the width of the target placeholder. Pretty cool!
TQTXTextMetric = record
  tmWidth:  Integer;
  tmHeight: Integer;
  function toString: String;
end;

TQTXFontInfo = record
  fiName: String;
  fiSize: Integer;
  function toString: String;
end;

TQTXFontDetector = class(TObject)
private
  FBaseFonts:     array of string;
  FtestString:    String = "mmmmmmmmmmlli";
  FtestSize:      String = '72px';
  Fh:             THandle;
  Fs:             THandle;
  FdefaultWidth:  Variant;
  FdefaultHeight: Variant;
public
  function Detect(aFont: String): Boolean;
  function MeasureText(aFontInfo: TQTXFontInfo;
           aContent: String): TQTXTextMetric; overload;
  function MeasureText(aFontInfo: TQTXFontInfo;
           aFixedWidth: Integer; aContent: String): TQTXTextMetric; overload;
  function MeasureText(aFontName: String; aFontSize: Integer;
           aContent: String): TQTXTextMetric; overload;
  function MeasureText(aFontName: String; aFontSize: Integer;
           aFixedWidth: Integer; aContent: String): TQTXTextMetric; overload;
  function getFontInfo(const aHandle: THandle): TQTXFontInfo;
  constructor Create; virtual;
end;

//############################################################################
// TQTXFontInfo
//############################################################################

function TQTXFontInfo.toString: String;
begin
  result := Format('%s %dpx', [fiName, fiSize]);
end;

//############################################################################
// TQTXFontDetector
//############################################################################

constructor TQTXFontDetector.Create;
var
  x: Integer;
begin
  inherited Create;
  FBaseFonts.add('monospace');
  FBaseFonts.add('sans-serif');
  FBaseFonts.add('serif');
  Fh := browserApi.document.body;
  Fs := browserApi.document.createElement("span");
  Fs.style.fontSize := FtestSize;
  Fs.innerHTML := FtestString;
  FDefaultWidth := TVariant.createObject;
  FDefaultHeight := TVariant.createObject;
  if FBaseFonts.Count > 0 then
    for x := FBaseFonts.low to FBaseFonts.high do
    begin
      Fs.style.fontFamily := FbaseFonts[x];
      Fh.appendChild(Fs);
      FdefaultWidth[FbaseFonts[x]]  := Fs.offsetWidth;
      FdefaultHeight[FbaseFonts[x]] := Fs.offsetHeight;
      Fh.removeChild(Fs);
    end;
end;

function TQTXFontDetector.getFontInfo(const aHandle: THandle): TQTXFontInfo;
var
  mName: String;
  mSize: Integer;
  mData: array of string;
  x: Integer;
begin
  result.fiSize := -1;
  if aHandle.valid then
  begin
    mName := w3_getStyleAsStr(aHandle, 'font-family');
    mSize := w3_getStyleAsInt(aHandle, 'font-size');
    if length(mName) > 0 then
    begin
      asm
        @mData = (@mName).split(",");
      end;
      if mData.Length > 0 then
      begin
        for x := mData.low to mData.high do
        begin
          if Detect(mData[x]) then
          begin
            result.fiName := mData[x];
            result.fiSize := mSize;
            break;
          end;
        end;
      end;
    end;
  end;
end;

function TQTXFontDetector.Detect(aFont: String): Boolean;
var
  x: Integer;
begin
  aFont := trim(aFont);
  if aFont.Length > 0 then
  begin
    if FBaseFonts.Count > 0 then
      for x := FBaseFonts.low to FBaseFonts.high do
      begin
        Fs.style.fontFamily := aFont + ',' + FbaseFonts[x];
        Fh.appendChild(Fs);
        result := (Fs.offsetWidth <> FdefaultWidth[FBaseFonts[x]])
              and (Fs.offsetHeight <> FdefaultHeight[FBaseFonts[x]]);
        Fh.removeChild(Fs);
        if result then
          break;
      end;
  end;
end;

function TQTXFontDetector.MeasureText(aFontInfo: TQTXFontInfo;
  aFixedWidth: Integer; aContent: String): TQTXTextMetric;
begin
  result := MeasureText(aFontInfo.fiName, aFontInfo.fiSize, aFixedWidth, aContent);
end;

function TQTXFontDetector.MeasureText(aFontInfo: TQTXFontInfo;
  aContent: String): TQTXTextMetric;
begin
  result := MeasureText(aFontInfo.fiName, aFontInfo.fiSize, aContent);
end;

function TQTXFontDetector.MeasureText(aFontName: String; aFontSize: Integer;
  aContent: String): TQTXTextMetric;
var
  mElement: THandle;
begin
  if Detect(aFontName) then
  begin
    aContent := trim(aContent);
    if length(aContent) > 0 then
    begin
      mElement := BrowserAPi.document.createElement("p");
      if (mElement) then
      begin
        mElement.style['font-family'] := aFontName;
        mElement.style['font-size'] := TInteger.toPxStr(aFontSize);
        mElement.style['overflow'] := 'scroll';
        mElement.style['display'] := 'inline-block';
        mElement.style['white-space'] := 'nowrap';
        mElement.innerHTML := aContent;
        Fh.appendChild(mElement);
        result.tmWidth := mElement.scrollWidth;
        result.tmHeight := mElement.scrollHeight;
        Fh.removeChild(mElement);
      end;
    end;
  end;
end;

function TQTXFontDetector.MeasureText(aFontName: String; aFontSize: Integer;
  aFixedWidth: Integer; aContent: String): TQTXTextMetric;
var
  mElement: THandle;
begin
  if Detect(aFontName) then
  begin
    aContent := trim(aContent);
    if length(aContent) > 0 then
    begin
      mElement := BrowserAPi.document.createElement("p");
      if (mElement) then
      begin
        mElement.style['font-family'] := aFontName;
        mElement.style['font-size'] := TInteger.toPxStr(aFontSize);
        mElement.style['overflow'] := 'scroll';
        mElement.style.maxWidth := TInteger.toPxStr(aFixedWidth);
        mElement.style.width := TInteger.toPxStr(aFixedWidth);
        mElement.innerHTML := aContent;
        Fh.appendChild(mElement);
        result.tmWidth := mElement.scrollWidth;
        result.tmHeight := mElement.scrollHeight;
        Fh.removeChild(mElement);
      end;
    end;
  end;
end;