
28.08.09

20 years Wall of Frame - Personal Perspective

There I was, unsuspectingly checking the proto.lab yesterday, expecting to see two lonely artists working their butts off trying to save our exhibition tomorrow. What I was greeted with was an enthusiastic crowd of 10 artists rocking the house, transforming the boring white walls of our lab into an installation many magnitudes bigger than anything I was expecting. Not only were they creative, they were on the case, organizing things, writing down what had to be done until the opening, and generally there was so much energy in the room I could hardly believe it. Needless to say, that made me very happy, because the proto.lab is, like most of our projects, a prototype - a test balloon trying to sail in 21st century society and trying to set a positive statement. I was expecting a long drawn-out process of trying to get people involved and trying to stick with projects that are somewhat vague and at first glance don't really have a reason for existing. Yet from the start two months ago people began to show up and work on multiple things. When master Deli came up with the idea of making a street art exhibition it was immediately clear to me that it would be a good idea, because it would make the boring white room look a bit more creative. Never would I have thought at that point what great feedback we would get from the local young (street)art community and how transformative the process would be for the room and for the group as a whole.
Tomorrow now is the big day of the opening, but I cannot thank everybody involved enough already - because no matter how many people show up or how well the event is received by our local politicians and residents (who tend to be not quite exactly the target group for a street art exhibition - if you have been around here you know what I am talking about) - it is already a huge success for all of us, because we have shown that we can come together and rock the boat.
I am greatly touched and inspired by the big unified artwork you have created, and I look forward to celebrating that with everybody who has helped or is generally interested in what the artists of the southern Berlin suburbs have been up to in the 20 years since the wall came down. So come by tomorrow at 3 p.m. at Kleinmachnow, Meiereifeld 33.
Everybody is invited to extend the exhibition with their own unique piece, either by bringing it along, creating it on the spot, or sending it to us via lab(at)prototypen.com for printout.

17.09.08

proto.lab: fader.flux

proto.lab proudly presents the first proto.flux production. Kiritan Flux made this handy little tool called fader.flux. Here is what it does:

fader.flux is a simple tool that maps the continuous controller values coming out of the MIDI controller DJ2 from Faderfox into a non-continuous range of values and outputs them via the OSC protocol. The tool is mainly for use in conjunction with software that does not understand the 12 continuous controller values of the DJ2.
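The idea behind it is easy to picture: the 0-127 range of a MIDI controller gets collapsed into a fixed number of discrete steps, and each step is sent out as an OSC message. The actual tool is a Max/MSP patch; what follows is only a minimal Python sketch of that quantize-and-send idea. The python-osc package, the host/port and the OSC address "/dj2/fader" are my own placeholder assumptions and not part of fader.flux.

# Sketch of the fader.flux idea: quantize a continuous MIDI CC value (0-127)
# into a few discrete steps and forward it via OSC.
# Assumptions for illustration only: python-osc, host/port and the OSC address.
from pythonosc.udp_client import SimpleUDPClient

STEPS = 8  # number of discrete output values per controller
client = SimpleUDPClient("127.0.0.1", 9000)

def quantize(cc_value, steps=STEPS):
    """Map a 0-127 controller value onto the range 0..steps-1."""
    cc_value = max(0, min(127, cc_value))
    return min(steps - 1, cc_value * steps // 128)

def on_midi_cc(cc_number, cc_value):
    """Call this for each of the DJ2's 12 CCs; forwards the quantized step."""
    client.send_message("/dj2/fader/%d" % cc_number, quantize(cc_value))

if __name__ == "__main__":
    # raw CC values 0, 64 and 127 land on steps 0, 4 and 7
    print([quantize(v) for v in (0, 64, 127)])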

download
Mac Standalone Version (MacOSX 10.4+ Universal Binary) - 5.4 MB
Collective Version (Max/MSP Runtime Environment 4.6 required) - 1.4 MB
Win32 Standalone Version (Windows XP) - coming soon…

30.03.07

Pimp Shake Command Line Renderer with Python

In a perfect world all software would work in all situations and all scripting languages would share the same syntax. In the real world my love for Shake erodes at the command line and I need to learn Python (on top of Javascript, MEL, PHP and Perl).

The Problem:
While Apple's Shake as of version 4.1 supports command line rendering through

shake -exec shakeScript.shk 

and offers methods for adding an sFileIn node and adding a fileOut node, it lacks in one important area. There is no way to do the following from the command line:

a) change an existing sFileIn node or change an existing fileOut node
b) change the inPoint and outPoint to be rendered to reflect an sFileIn node's frameRange

So any serious batch processing from the command line that takes different input movies of variable length, processes them and outputs them to different output movies with a descriptive name (like a version of the input movie's name) is not possible in Shake at the moment.
But Shake is such a Unix beauty of a program, and the Shake script documents are easily readable text files. So I set out to make a small Python script that adds exactly that functionality to the command line: a simple batch renderer.
I first tried a normal shell script, which got scrapped in the very early stages because I ran into some kind of permission error while reading out the movie length information. Then I had a brief try in Perl, but realized that my Perl is not sufficient and that if I learn a new scripting language it should be Python, because, well, god and the world swear by Python these days. With Python recently included in Autodesk Maya 8.5, Python built into Blender 3D, and about every cool new program in the visual FX area adopting it, it is quite clear that I should get familiar with it sooner rather than later.
So eight hours later I can release my first Python script, which is a bit crude but does what it's meant to do simply and well. I have not figured out why, out of all the languages, it must be Python that seems to win the scripting wars, and some of the Python logic is not absolutely clear to me yet, but coding was indeed almost fun at points (those where things worked as expected).

Preparations:
1.) make a Shake Script that does what you want to do.
2.) The Shake Script should have exactly ONE sFileIn (preferably pointing to one of the movies from the batch, so everything is set up right: pixel size, aspect ratio etc.)
3.) and the Script should also have exactly ONE fileOut. It doesn't matter what path or name the fileOut references (it will be replaced in the batch),
4.) but it DOES matter that you make the fileOut a QuickTime movie with your desired compression settings - otherwise the script will fail.
5.) save the Shake Script with a descriptive name

How the batch works:
1.) make a folder
2.) put the Python Script into the folder
3.) put your Shake Script into the folder
4.) put all the movies you want to have processed by the Shake Script into the folder
5.) navigate to the folder in the command line and make it the current directory

cd <drag'n'drop the folder onto the Terminal window>
(a tip for the non-terminal users)
6.) execute the script with
./ShakeBatchRender.py
7.) Watch your movies being rendered

Features:
1.) The script creates a Shake Script for every input movie. So if, for example, you make a bluescreen key and it works on 9 out of 10 shots, you can just open the one shot it didn't work on, adjust the settings and render from within Shake. The sFileIn, the fileOut and the frameRange are already set correctly when you open the script for the problematic scene. The Shake Scripts are named

MasterShakeScriptName_InputMovieFileName.shk
2.) The script creates descriptive filenames for the output movies:
InputMovieFileName_ShakeScriptName.mov
(a Python sketch of this core loop follows below)
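This is not the released ShakeBatchRender.py, only a rough illustration of such a loop under my own assumptions: MASTER_SCRIPT, OLD_INPUT and OLD_OUTPUT are placeholders for the paths used inside your master Shake Script, and the frame-range handling of the real script is left out here.

#!/usr/bin/env python
# Sketch of a Shake batch loop, NOT the released ShakeBatchRender.py.
# Placeholders: MASTER_SCRIPT plus the sFileIn/fileOut paths used inside it.
import glob
import os
import subprocess

MASTER_SCRIPT = "MasterShakeScript.shk"
OLD_INPUT = "/path/used/in/master/input.mov"    # path of the sFileIn in the master script
OLD_OUTPUT = "/path/used/in/master/output.mov"  # path of the fileOut in the master script

master_name = os.path.splitext(MASTER_SCRIPT)[0]
master_text = open(MASTER_SCRIPT).read()

for movie in sorted(glob.glob("*.mov")):
    if movie.endswith("_" + master_name + ".mov"):
        continue  # skip movies this batch has already rendered
    movie_name = os.path.splitext(movie)[0]
    new_script = "%s_%s.shk" % (master_name, movie_name)   # MasterShakeScriptName_InputMovieFileName.shk
    new_output = "%s_%s.mov" % (movie_name, master_name)   # InputMovieFileName_ShakeScriptName.mov

    # Shake scripts are plain text, so swapping the sFileIn/fileOut paths is a simple replace.
    script_text = master_text.replace(OLD_INPUT, os.path.abspath(movie))
    script_text = script_text.replace(OLD_OUTPUT, os.path.abspath(new_output))
    open(new_script, "w").write(script_text)

    subprocess.call(["shake", "-exec", new_script])

The released script additionally reads each movie's length and writes the matching inPoint/outPoint into the generated script, which is exactly the part that Shake's own command line is missing.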

Caveats:
1.) the script only supports .mov for sFileIn
2.) the script only supports .mov for sFileOut
3.) the script should only be run once. If you need to run it multiple times (to correct a problem), make sure you delete all the newly created Shake Scripts and movies, otherwise you end up with multiple versions, duplications and a general mess.
4.) if the script does not run, make sure the permissions are set correctly. Do this by typing the following in the command line:

chmod 744 ShakeBatchRender.py

Conclusion:
It's crude, but I might develop it into something more interesting if Apple makes future versions of Shake (which I really, really hope). If not, then this was a crude hack to learn Python.

License:
Script is under GNU General Public License v2.0. See enclosed document called "COPYING_LICENSE.txt" for more information.

Download:
Download the Script with documentation and a tutorial folder here. (700kb)




24.03.07

Web Development in 2007

Or: Does Internet Explorer suck intentionally?

I have just completed a small web project - nothing life-changing, nothing fancy, nothing ridiculously cool, and generally a very, very small project - perfect for fiddling around with some technology that I had wanted to employ for a long time. Not having done semi-serious web development since I redesigned this blog, I was curious about a number of questions, especially "What is wrong with Internet Explorer and why is it wrong?" - but we get to that way, way later (the most interesting part, but the rest leads up to it). First came some basic fact checking:


"Whats the backwards compatibility path used these days?"

Browsers change, get recoded, disappear, reappear and fight with each other - that's generally the situation each and every web developer on the planet faces with their first website. There is no way to support every browser on the planet in every version number ever released - period. Everyone who takes Internet Explorer 4 Mac Edition and makes a website work on it the same way it works on a Firefox 2.0 installation deserves a lifetime "you are wasting your time" award.
Generally I approach this question with a certain bias. First and foremost I want to push technology to the edge, because - as stated here before - only technology that gets used gets pushed forward, redeveloped, reused and made generally more useful.
But I am getting sidetracked. So, being at the beginning of 2007, some way into a fresh, awesome millennium full of (sometimes) exciting technological advancements: how far does a normal "mom and pop" website, one that needs to reach a broad general audience across the full spectrum of age and technological knowledge, need to push it with web browser compatibility? Since this website intends to sell you something it needs to work - easy, clean, simple and perfect. Now if we look at browser statistics these days (I don't believe any of them, but I generally don't believe statistics that I haven't come up with myself, so that point is moot) the field is WIDE open. The general consensus is that there are a total of four browser engines on the market worth looking at (for desktop web browsers, that is - more on that in a minute).

1. Internet Exploder
2. Mozilla/FireFox/Gecko
3. Safari/KHTML/Konqueror
4. Opera

For me personally, and for this specific project, one browser falls out of consideration right away. I am really sorry, my open source, "we make everything different than the guy next door just so we are different and are perceived as cool" friends - Opera is a non-event for me, and I would venture to guess for about 99% of web developers. Yes, according to the statistics Opera has a whopping 1.5% market share. I have met only two people in my whole life who use Opera, and if those two (totally unrelated to each other) give a picture of the Opera-using population, then it's safe to say they are technology-savvy enough to figure out that when a website doesn't run it might be the browser, and I am sure they have 10 other obscure browsers on their machines to access the site. That goes out to the Opera team as well - if you want your browser to be used, make it 100% CSS/HTML/XML standards compliant and 100% Javascript/DOM compliant, because web developers have a life, and really there are enough problems to fix without looking at every obscure "me too" browser on the market. I really do love and try to support small software rebels, but my prior experience with Opera was so BAD (in development terms) that I am absolutely sure ditching it will not cause any ripples in the space-time continuum, and it gives me at least 10% more time outside of web development to rant here.
With this out of the way, you, dear reader, might say: "Hey, but what about Safari/KHTML? It's similarly obscure and has a similarly small market share." Yes, dear reader, at first look it might seem so, but from personal experience I can name about 100 people (compared to Opera's two!) using Safari daily, because it comes with a more or less widely used operating system called MacOSX, and as it is with these bundled apps, some people think they have no choice but to use them. The great thing about Safari - besides being bundled and forced down mom and pop's throats, and totally unlike Opera (never used 8, but used 7 and 6) - is that it's about the most standards-conformant browser on the planet, yes, even better than Firefox. It is even recommended as a standard reference platform (wherever I read this - if I find it I'll post a link). So even with a tiny market share, which I personally think is really at least five times as big as in the "statistics", the developers of KHTML/Konqueror, together with the enhancements from Apple, did something Opera has utterly failed at: eliminating the need to specifically test for this platform. When you adhere to the standards set by the W3C you can be 98% sure that it runs, looks and works beautifully. That's in Javascript/DOM, CSS, XML, XHTML.
Another great thing about it is that it's automatically updated with the system (the Safari variant - Konqueror users are probably also very up to date, as Linux users are in general), so you can be sure that most people using Safari are also on one of the latest versions. So testing is constrained to 2.0 and onward.

Moving up the percentage ladder we approach the burning fox. While I was not very impressed by the early siblings (Camino first of all), Firefox is now a stable, mostly standards-conformant platform, and with the FireBug plug-in it has become my main web development choice (this might change with OSX 10.5, but I can't say anything - NDA). So it's clear that my developments will work out of the box with Firefox and hopefully all Gecko-based browsers. So how many versions back do you need to test? I don't test anything before 1.0, because before 1.0 most people using Firefox can be assumed to be intelligent, fast technology adopters, and they probably have the latest version of Firefox installed. Actually I am not even testing on versions later than 1.1 at the moment, because I think the majority will only upgrade at X.0 releases, and those hopefully didn't break things that were working (and you cannot under any circumstances be responsible for any and all nightly builds there ever are of any open source browser anyway).

With those out of the way we get to the point where things start to get ugly. That point is the internet venom called Internet Explorer - or, as it is nicknamed by about every serious web developer: Internet Exploder. In case you have not heard of it, it's the browser that Microsoft "pushed" out the door in the mid 90s to combat Netscape's dominance in the early internet. It's the browser that started the browser wars, killed off Netscape (temporarily) and has since earned Microsoft a couple of antitrust lawsuits and A LOT OF HATRED among web developers of all kinds. The problem is: Microsoft won that browser war (temporarily), and the antitrust lawsuits have done nothing to stop the bundling of that browser with the most used operating system in the world - namely Windows. So with about 60% browser market share as of last month (if you want to believe the linked statistics) it has more than double the share of Firefox and just can't be ignored, no matter how much you swear. Now all this would only be half as bad, but those 60% are quite unevenly distributed between the three main version numbers 5.0, 6.0 and 7.0. And looking at the individual percentages, each has more than double the percentage of Safari, so you had better support them all. Heck, I would even throw in 5.0 Mac Edition for the fun of it, because I have personally witnessed people STILL using that! Now a person not experienced in web design might say: "hey, it's all one browser, and if you don't add any extraordinarily advanced functions, 7.0 should display a page like 5.0 and everything should work great."
Well, without going any further down a well-worn path I can only say this: It fucking doesn't. If you need to support people using Internet Explorer you need to go back to at least version 5, including the Mac Edition.
Now if Microsoft had tried to support the web standards as they are kinda set in stone by the W3C, this would all be only half a problem. But Microsoft has chosen to go down their own path and alter little things in the W3C spec - best known is the box model difference in CSS.
(I am going to get into that in a second - just need to find a way to round up this section.)

What I haven't touched yet - because of a clear lack of experience - are phone and other device browsers (gaming consoles). For this specific project this was no issue, as I think the number of people using a phone to order a highly specialized documentary DVD is close enough to 0. Gaming consoles are definitely not the target group of this DVD either. For up and coming sites out of this office I will clearly look into the "outside of the computer" browsers and will surely post my findings here. Generally I think they are all going to move to an open source engine like Gecko/KHTML sooner or later (the iPhone will drive that; Nokia, for example, has already decided to use KHTML; the Wii is using Opera - I tried browsing on the Wii and it sucks badly compared to all the other beautiful stuff on that machine).

To round this up: if you want to reach the mom and pop majority of the web, I concluded you have to test on Internet Explorer back to version 5 (including the Mac Edition), Firefox 1.0 and upwards, and Safari 2.0 and upwards. You may also want to check your site with a browser that does neither Javascript nor pictures nor anything in that regard, to make sure your site is accessible for the blind and readable by machines (Google robots, for example).

Now with that out of the way the next question formed in my head:


What content management system should I use?

While this specific project has few content updates, it still has some (adding new shops that sell the DVD to the reseller list, for example) and, more importantly, it had to be deployed bilingually. Both of these considerations prompted me to go with a CMS. Now I don't know if anyone has looked at the CMS market lately - I have done some intense research (also for a different project) and it's quite a mess. There are basically two types, blogging-centric and classic CMS-centric, and a lot of in-between breeds.
Since I don't want to bore you too much: most of the open source CMSs can be tested out at the great site opensourcecms.com.
Personally the only CMS I have used (and am still using, for various reasons, but basically because I really do like it) is the blogging-centric Movable Type (not open source and costing some dough). But Movable Type is so blogging-centric that doing anything else with it is counterproductive (though it can be done). So me - freshly in the CMS game, knowing that "blogging-centric" is not something I want here - looking at all the options, I found out that it's very hard to decide on one just by looking. The user comments on opensourcecms.com are already very helpful in sifting out all the ones that have pre-beta development status. Left over are basically the big four CMSs: Typo3, Mambo, Drupal and Plone. All with their own good and bad sides. The one that I really, really liked from a pure technology, feature and stability standpoint was Plone, but Plone depends on Zope, and for Zope you need a very heavy duty server that runs just that - I don't have one. The learning curve for Typo3 seemed much too high for me - thanks, I am already used to PHP/Perl/XHTML/Javascript/CSS etc. and have no need to learn another obscure description language on top of that just to use a CMS.
This left me with Mambo and Drupal as the likely choices. Mambo's architecture seems dated and is at the moment in a state of flux and recoding - I do not like unstable software and have no need to beta test more warez than I am already doing - so Mambo was out. Drupal came out as the winner in the CMS field at the moment - but NOT SO FAST, my friend. I installed it and used it. It has a flashy web 2.0 interface with lots of useful functions. Well, there are SOOO many functions that would never be needed for this project. Also, in the default install it is very heavy on the server (and I had no urge to get into the 200-page discussion on optimization techniques for Drupal in their forums). It became clear that this CMS is not the solution. The only function I was really looking forward to was including QuickTimes and MP4s in an easy way. It turned out that including these is easy - having them show up in the way I like, and not in the Drupal developers' vision of "another tab in a tab in a window in a frame", proved extremely difficult.
Now this left me with either going with a fast hardcoded website that I would need to maintain for the next 5 years, or digging up a CMS that I had used before and almost forgotten about - x-siter.
This CMS is written by fellow CCCler Björn Barnekow and is totally unsupported in any way other than "ask him and he might reply". The beauty of it is that it is ULTRA lightweight - he himself describes it as the smallest (code-wise) CMS out there. It is totally PHP, and even if you have no clue about PHP it's very, very easy to understand what it does. From the perspective of the end user who needs to maintain the site, the approach is unrivaled and beautiful, because you just edit paragraphs of text, add pictures etc. on the page they belong to. So no fancy meta storage system for all the pages, and no folders containing the pages - you edit right inside the individual page. Now this is a huge advantage if the person you need to teach how to update the site is your average secretary or such, because she browses to the page she wants to change, logs in, edits it and logs out - it's very, very close to WYSIWYG editing and very easy to explain to everyone.
The layout possibilities with x-siter are also well thought out, giving you an adjustable typographic raster that you can put pictures or text into. A very nice approach. So why isn't everyone with a small website using x-siter, and why has nobody heard of it? Well, first of all it's more of an internal tool for Björn that he shares without any official support, and there is not much documentation inside the code either. He thinks you might not need to touch much code, and generally he is right; sadly, design realities are different, and I have a concept in my head of how a website needs to look, and if the CMS tries to get me to do it differently I'd rather adjust the code of the CMS than adjust the look in my head. And this is where x-siter shows a big weakness, because the code is not very modular and not very well commented, so I had to change quite a few things and now the code cannot be easily upgraded. But generally, if you need a small site very fast that needs to be updated now and then, x-siter is definitely worth looking into. Even one Max Planck Institute switched from Plone to x-siter, because it's so much faster, actually scales quite nicely and has good user account management. So while it does lack some of the advanced things from Drupal, I generally do not miss those features (and most can be added through other means anyway).

So I employed x-siter and heavily modified it to get the certain look I wanted (taking some headlines out of tables and into CSS divs etc.). Since the site is pretty simple, I needed an excuse to implement at least one advanced, cool, dynamic function in there.


What cool "new" "web 2.0" technologies are really cool, worth testing out and generally useful for this small website?

Well, I couldn't find much. First I thought I'd rewrite the whole CMS to make it fully dynamic (without any full page reloads - imagine that), but thank god I did not go down that route. There was one function of the page, though, that definitely needed some design consideration. It's a function that on return kind of breaks the whole look and feel of the app by generating an ugly error/confirmation HTML page.
That function is a simple formmailer to order the DVD. Now this function — needless to say — is also the function that is most important on the whole page.
So my thinking went down the route of "Hey I want to let the return page of the formmail.cgi replace just the "div" of the form. If there is an error I should be able to go back to the form and correct it (without having to completely fill it out again)."
Great - that's a simple AJAX request to the server, putting the returned HTML into the DOM of the current page, with the option to return to a saved state of the DOM. YIPPEEE, some coding fun - or so I thought.
Generally, implementing this with FireBug and SubEthaEdit was dead easy - almost scarily easy (look at the code at the bottom). Here is how I did it:

First I replace the normal form button with an ordinary "javascript:function()" link button, via a bit of Javascript below the normal form button. That ensures that people without Javascript can still order the DVD through the normal non-AJAX/ugly/unintuitive way of doing things from the last millennium. Those people get a normal submit button, while people with Javascript enabled get the AJAX button, because those people should also be able to use the AJAX functions.
So the user fills out the form and hits the AJAXified "submit" button. The form data is then sent over a standard asynchronous connection through a standard xmlhttprequest. At this point you already have to add browser-specific code, but this has been figured out, and the code you add JUST for Internet Exploder is already three times as long as the code would be if that browser functioned normally.
Anyway, the data is then processed by the normal nms formmailer.cgi, and this returns a (sadly non-XML) HTML page. I then parse this HTML and check whether it's an error, or even a server error, or whether the result is an "ok", and then let the specified output for each case drop into the DOM of the webpage in the only correct way (which is NOT innerHTML!).
Before I exchange the content of the web form with the result page, I save the web form with all its content in a DOM-compatible way using cloneNode (it's just one line of code, I thought!). If the result is an "ok" I purge the stored data and tell the user that the order is being processed and what data he has sent. If there is an error, there is a Javascript link in the result page that, when clicked, exchanges the result page with the form and all its content.
So far so good. This part - coding plus learning the whole technology behind it - took me three hours.

So the website looked great, the functions worked as expected, and since I am hyper-aware of the CSS box model issues of Internet Exploder it even looked great in Internet Explorer 5 on the Mac. At that point - a point of about 20 hours of total work (including digging through the PHP of x-siter!) - I considered the site done.
BAD idea.


Problems with Internet Explorer 5.0, 6.0, 7.0

The first thing I noticed was that IE 5.0 Mac Edition does not support xmlhttprequest AT ALL, and it does not do any DOM either. That made me aware that a very few users might have a browser that a) has Javascript but b) does not support any of the modern AJAX and DOM functions.
Well, that was easily fixed by trying to figure out, in the very first script that gets called (the one that replaces the submit button), whether an xmlhttprequest can be established. If not, the user gets the ugly normal non-Javascript version; if yes, the browser should be capable of getting through the whole process.
Again a sigh of relief from my side, and 5 minutes of coding later I closed up shop for the day. The next day would just be some "routine" IE 6.0 and 7.0 testing on the Windows side, and then the page would be done - so I thought. I was very happy because I had very portable, future-proof code that was small, lightweight and easy to understand. There wasn't even a bad hack in there, except for the xmlhttprequest check.
Opening the page on the Windows PC with Internet Explorer 7.0 my heart dropped to the ground like a stone made out of lead in a vacuum on a planet the size of our sun.
Nothing worked. The layout was fucked (I am quite used to the CSS box model problem, so I had avoided that in the first place!) and the submit thing did not work at all - clicking on the button didn't do shit.
The CSS was fixed after fiddling with it for two hours; there seems to be a "bug" in IE 7 with classes vs. ids and z-depths. Fixing the Javascript was much harder. I couldn't use FireBug to get to the problem, because in Firefox the problem didn't appear. The IE 7 debug tools are as crude as ever (a Javascript error console, which did not produce any errors).
So I placed strategic "alert"s in the code to see how far it would get and what the problem was. It turned out the first problem is that it cannot add an "onclick" event listener to something I changed inside the DOM after the DOM was drawn (add 3 hours to the time total). I struggled for a solution and rummaged the web for any clues. It seems that IE up to version 7 (that's the current one!) cannot do "setAttribute" as the W3C says; instead you have to set every attribute through object.attribute = "value in stupid string format"; so for a link it's object.href = "http://your link"; instead of just putting it all in through object.setAttribute("attribute", "value");
Now, if you think you can also add an event listener this way by doing object.onclick = "function"; - forget about it. It seems, through extensive testing (2+ hours), that there is currently absolutely no way to add an event listener through normal means to an object that was created after the webpage was built - in Internet Explorer, that is; in Firefox and Safari this works wonderfully. So my solution was to use the "fake" onclick through a href="javascript:function();" - thank god this exists, otherwise I would have been forced to either write 200 lines of code for custom event notifiers (which I am not sure would have worked) or abandon the already working approach altogether.
If this still does not sound like a serious deterrent: this was not all it took to solve the problem. Creating an object and then, before writing it into the DOM, setting the href attribute to the "javascript:" call also does not seem to work in Internet Explorer. I had to actually write it into the DOM, pull the object by its ID again, and then change the href attribute. This doubled the code for this portion of the Javascript.
But the problems were far from over. As you might remember from earlier, I save the part of the DOM containing the form with its content so I can get back to it in case there is an error with the form. This worked great in my two standards-conformant browsers. So I am at the point where every important browser can get the form mailed out. If you have an error you get the "back" button, which I could also make work in IE through the object.href="javascript:" way. Now I was sweating that this whole "cloneNode" business might not work in IE and that I would have to parse the whole form and rewrite it, but clicking on the now functioning button did in fact work as advertised (by me) and I returned to the form. But the trouble was far from over, because now my "submit" button didn't work again. Imagine that - I am cloning the DOM, as in CLONING, which means I take each object with all attributes and values and CLONE IT into memory. When I put this CLONE back in, it should be just like the original - well, it is in Firefox and Safari. Internet Explorer seems to choose which kinds of attributes and values it clones and which not, because the values of the fields had been cloned and correctly copied back, yet the href attribute is apparently not clone-worthy and is completely forgotten. At that point I had been sitting a full day at debugging IE's Javascript/DOM implementation. So on the third day I just made the dirty "grab the object again and change the href manually" hack and let it be.
In total I have recorded 27.4 hours of development/design for the whole page, including PHP scripting (which I have no clue about), and 13.6 hours of IE CSS/Javascript debugging (where I have a very thorough understanding). My code bloated by a third just for IE hacks and is less readable and less manageable in the future. And it definitely runs slower - not that you notice it in this small app, but extrapolating that to a huge web app like Google's spreadsheets, I think the speed penalty is mighty.


Why is Internet Explorer so bad?

Throughout this project (and many before it) I have asked myself this question. I am no programmer and have no degree in computer science, yet I think it can't be THAT hard to make a good standards-compliant browser. Microsoft had about seven years (yes, SEVEN YEARS at LEAST) to make Internet Explorer standards compliant, listen to the complaints of the millions of web developers and redo the whole thing from scratch. In the meantime Firefox has made it to version 2.0, Safari will soon be at version 3.0, and even a small shop can churn out a browser at version 8 - even if it's not perfect, Opera is still MUCH better than IE ever was.
Now, Microsoft is swimming in money and has thousands of coders in house who are all probably a million times smarter than me when it comes to programming. The bugs that are present are enormously obvious and waste millions of hours of web development/design time. The box model problem should have been just a matter of adjusting some variables, and the Javascript engine - well, after a rewrite with native DOM support (at the moment it's a hack at best, I would say) all the problems should be gone.
Now, Microsoft had the chance to fix this with Internet Explorer 7, and while transparent PNG support (a hack) and finally a fix for the box model problem (also not 100% fixed, so I heard) have been touted as breakthroughs in supercomputing or something, the whole DOM model Microsoft uses does not work (and they admit that on the IE dev blog - promising to look into Javascript/DOM for 8.0 - in 10 years). And that at a time when the whole web wants to use the DOM to make rich web applications with great, consistent interfaces. I have looked into some of the "AJAX make my webpage move and look all funky" frameworks and I can tell ya: they have the same problems as me, and sometimes more than half the code in them is there to get around IE limitations - which slows them down to hell, I guess.
So IE 7 is almost a non-event - and I am asking now even louder: WHY COULDN'T MICROSOFT FIX THOSE THINGS?

First, my personal take on this: they have no clue. This multibillion dollar company has no idea why someone would want a consistent interface on a website that doesn't reload just because a number changes somewhere down in the text of the webpage. The reason I think that: look at Vista. Vista is flashy, has an inconsistent interface (I'll just say: 10 ways to shut your computer down!) and uses every processor cycle available on your CPU and GPU just to run itself (so much that not even cheap modern laptops can run the full version flawlessly and fast). So if they don't realize that this is important for themselves, why would they realize that these are important concerns for developers outside the Microsoft bubble?
Now, pretending that a multimillion dollar company is too dumb to make such assumptions is probably as dumb as thinking that the Bush administration has no advanced hidden plan behind what they are doing (well, you know, as with Microsoft - they could be just plain dumb, or have some greater goal that nobody fully understands, or those who do understand don't have a loud enough voice to make it public).
So, since we are not dumb over here, we stumble upon ideas about why this all is the way it is. The best take is by Mr. Joel on Software in his blog entry called API Wars. Joel is a software developer writing bug tracking software for Microsoft's operating systems. He is well known and very respected in the developer industry, and his sometimes controversial statements cause quite a stir now and then, but he - being very much inside the OS and probably able to read Microsoft assembly code backwards - is most of the time spot on. In the linked article he talks about how Microsoft has always been about protecting their main treasure: their API. The Office API with its closed document structure is the crown jewel above everything else. He also talks about how they have so many APIs that developers are getting confused, and since most of the APIs are for sale, developers nowadays turn away from the APIs Microsoft provides and are turning TO THE WEB to develop especially small applications - and the reason most shops are on Windows is believed to be exactly those simple small applications only available on Windows + Office.
Now, I said "the web" - the web has the notoriety of running on any OS somehow, and it manages to circumvent Microsoft's monopoly in a nasty, uncontrollable way - poor Microsoft.
Now you probably understand where Joel and I are getting to - yes, Microsoft does anything to stop the web from spreading too fast and getting too useful before they have found a way to completely control it - and they try so hard; DRM, TCP etc. etc. are all designed to control web content. Good thing that web apps are not content and that Microsoft is acting too slowly. When you read through Joel's entry you get quite a clear understanding that Microsoft is not at all interested in making web development easy, and the longer their Javascript does not work and their interactive DOM model only works in some strange emulated way (change something in the DOM and look at the HTML source - you will not see your change reflected there), the longer they have a monopoly. And that their monopoly is threatened by web apps is apparent from Google's spreadsheet and word apps - sadly for Microsoft, these already run even on a broken IE 7 (poor Google guys who had to debug that).
I do see that Microsoft will try to "fix" these problems, because a) this is turning into a publicity backlash that not even Microsoft can hold out against (the BOX model problem circulated for years but then gained so much steam that Microsoft HAD to release a press release stating that they were working on an IE 7 that addresses some of those things - that was three years ago or so), and b) so many developers/admins/tech addicts etc. suggested to friends that using Firefox is more secure (thank god IE has had soo many open doors for viruses) that Firefox usage exploded and is now eating slowly but steadily into the IE market share. Now Microsoft faces a very big problem - a problem so satisfying for web developers that I would sell my kidney to see it unfold (maybe not, but I would hold a party just in case of that event). Microsoft understood in the mid 90s that if they have a large enough market share they can force any competing browser out of the market by introducing proprietary standards (ActiveX, their own implementation of CSS and many more), because the web developers go with the browser with the biggest market share. That worked quite well and played out like they intended - almost. Web developers are a strange bunch of people, and some are using Linux and other evil stuff, so they ensured that big companies tried to stay cross-browser compliant (I am one of those who wrote about 100 emails to big companies telling them their website doesn't work on browser xy, and the millions of others doing that are the reason we do not have ONLY Internet Explorer today - Microsoft really almost completely won - this was close).
Now back to the point and to the happy future outlook. If Internet Explorer's market share were to drop below, let's say, 10%, I would be the first person to drop Internet Explorer support completely and rather debug small things for Opera in addition to the two nicely working Firefox and Safari browsers. My thinking is that the hatred for Internet Explorer among web designers/developers has grown to such proportions that I would NOT be the only one - heck, I would even say this browser could fade from visibility in less than a year. This would satisfy so many people who have lost so much time of their lives just because some stupidly rich multibillion dollar company wanted to get richer on their backs. Now, this is not unfolding right this minute, but I can see that if they wait too long with IE 8 and with fixing the aforementioned Javascript/DOM problems, this might go faster than everyone thinks. Web time is a hundredfold acceleration of normal time, and a monstrous creature like Microsoft might not be able to adjust fast enough - one can really hope.

5.07.05

The future of the web is not the browser

Making tools myself that take information in the form of RSS feeds and XML streams etc. and put it into new clothes, I get more and more the feeling that the future of the web does not sit inside the browser window. As more and more information becomes available, this information craves the attention of the world's population, and to distance itself from other information the browser seems not to be the best tool. I am not talking about flishy flushy Flash sites; I am talking about proper use of this information, presenting it in a manner that makes it easier and faster to understand and comprehend. It's tools like widgets, screensavers, blog TVs, tickers and tools that have yet to be invented that will take the web to a version 2.0, where information is not only plentiful and pretty to look at but also presented in a form that makes it easier to follow the information trail that interests the viewer. For now, take a look at the screensaver I did for the incom workspace tool of the University of Applied Sciences Potsdam to see a concept of what I mean. Pictures and user-definable RSS streams are mixed to give the outsider an overview of the school and its work website presence, the newcomer a way to find a use for the tool, and the insider a nice way to follow what is going on while taking a break. I am not saying it's anything new; I have just come to the conclusion that this is where part of the web is headed, and I really like that.

PS: The screensaver is programmed in Apple's Quartz Composer and runs only under MacOSX 10.4.1 Tiger with a graphics card that has at least 16 MB. I would like to thank Pierre-Olivier Latour, the Quartz Composer architect, for his help throughout this little project.

23.02.05

MEL Script Release

The intro of the music video I mentioned yesterday was created with Alias Maya. I had to write two MEL scripts that got rather big and are almost full-blown plugins. Watch the first 30 seconds of the movie to see them in effect. The first is a "random neon light flicker on". If you have ever watched multiple neon tubes being switched on, you might have noticed that they randomly blink and flicker until they are fully turned on. I figured out a little algorithm that quite naturally mimics this effect. The algorithm is then applied to new textures for geometry objects and to the light brightness for light objects.
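The post above doesn't spell the algorithm out, so to picture the idea: per frame, the chance of a tube being lit rises until it stays fully on. Here is a tiny Python sketch of such a flicker-on curve; it is only my own guess at one way to do it, not the MEL code from the release (which keys texture and light brightness attributes inside Maya).

# A guess at a "random neon flicker on" brightness curve, for illustration only.
import random

def flicker_on(frames, seed=0):
    """Per-frame brightness values: 0.0 = dark, 1.0 = fully lit."""
    rng = random.Random(seed)
    curve = []
    for f in range(frames):
        warmup = f / float(frames - 1)        # 0.0 at the first frame, 1.0 at the last
        if warmup >= 0.9 or rng.random() < warmup:
            curve.append(1.0)                 # the tube is lit on this frame
        else:
            curve.append(0.1 * rng.random())  # dark, with a faint random glow
    return curve

# Example: 25 frames of flicker, fully on towards the end.
print([round(v, 2) for v in flicker_on(25)])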
The other plug-in I call "spatial movie", as it spreads the pictures of a sequence out over planes (or other geometry); when you move by with your camera it makes the movie sequence come to life - almost like film works. You see this at the very end of the movie, when the 3D part blends over into the real-life footage.
All MEL scripts are released under the GNU General Public License and are copyrighted by me, 2005. Feel free to use the scripts if you ever need them. Send me the results if you like - that would make me happy. If you enhance the code, please follow the spirit of free software and make it available to the public. The release also includes manuals and tutorial projects. The scripts should run in Maya 5 and up. Now enjoy:

fALks MEL script release No.1 (40kb)

22.02.04

700GigaByte inside a G5

After a bit of fizzling I now have a fully functioning 700GB G5 under my table. That is, with a 500GB Serial ATA striped RAID (6Y250MO and 7Y250MO Maxtor) and a 200GB ATA drive (6Y200PO Maxtor). I have not done full benchmarking yet, but I can report that the RAID is a beast; it peaked yesterday at 112MB/s read performance. Benchmarks and pictures will follow.
I just want to say that I hate Apple for not using the room that is available. There is so much space above the optical drive that not only could I put the drive there, I could even elevate it with some plastic screws about one cm above the optical drive, giving me enough safety room for cooling and a path for the air blowing through from the drive fan.
The cable mess is less than expected, even with a much too long IDE cable. I had some problems with the optical drive being slave, so I reversed it again and it works beautifully with the optical drive (SuperDrive) as master.

11.11.03

Discussion of the OSX for Realtime Performance Document on the ArsTechnica Forums

There is a discussion going on about my document in the Arstechnica.com forum.


I was quite surprised to see my document discussed on arstechnica.com, my absolute favorite tech site on the net. So I was quite sad that all that is there is bashing and no constructive criticism. I do need feedback on this document, as I want to advance it further, and more technically advanced people can give me more insight into what is going on inside this system and where we could get some more juice out of it.

arstechnica.infopop.net

As for some of the objections raised in the thread so far:

the font: This is not a GOTHIC font, it is a Fraktur font. Firstly, I am an artist; if I want to put out a document in Fraktur that is my thing, and you can just copy and paste the text and give it any font you like. Secondly, I explained why I used it, and I have emails sitting here telling me that it works quite well. Up to 50 years ago ALL written word was in this style, so it must be readable, don't ya think? Maybe your reading has degenerated into something that is not worth being called reading? Maybe you do not care for detail anymore?

the chmod 7777: There was no other way to make this script run on a friend's computer, as he ran into permission problems; this was a problem with FileVault and totally unrelated to the script. Yes, it only needs to be 755, sorry about this...

apache is not using resources: that might be, but I tried to slim the system down as much as I could, with EVERYTHING removed that MIGHT be causing trouble, making the boot process slower etc. Apache I do not need in a realtime environment.

there might be problems with other stuff I removed: If you had read the document you would have seen that I say explicitly: this system is only there for running this single application. If you plan to do other stuff you will run into problems sooner or later. I know that deleting the cron jobs is not a smart thing to do on a heavily used system. I have personally seen the cron jobs kick in in the middle of the night while I was standing in front of 8000 people, and they slowed down my computer; there was no way I could have stopped them manually. This document is for the less computer literate, and I have a hard time explaining how to move the cron jobs to a time when there is no chance you have a performance. Besides, I can never tell when I have a performance; sometimes I gig during the day for a presentation, or during the week at a festival, or on a Sunday-to-Monday night. So for me, deleting the cron jobs was a good way to get around this problem. I try to eliminate ALL disk access from the system, and cron jobs surely cause disk activity.

I don't believe the 20% performance increase: Have you tried it? I did not believe it myself. I do have two systems running side by side and I can clearly see the difference... Maybe I really do need to do the benchmarking thing. I hate benchmarks; they are so far removed from a real working environment. I have top, and top tells me that I have up to 22% processing time free when running VDMXX under certain circumstances, while under the same circumstances I have no processor time free (0%) when running it on the non-optimized system.

So if you have constructive criticism I would like to hear it. Bashing is out of place, unless you have something better to offer, which I would then love to hear.

4.11.03

Deep from the proto.lab - Optimizing OSX 10.3 for Realtime Performance

or how to make the BigBlackKitty run


Deep from inside our green laboratory comes this first release of dirty knowledge. It puts some fire under the BigBlackKitty released by Apple Computer Inc. two weeks ago. It is intended for those who crave the last bit of performance and can live without the gadgets and juices of this OS. It disables all services, leaving only those running that are important for realtime audiovisual performance (and number crunching). In preliminary tests I have seen a performance increase of up to 20%. Talk about a modern multithreaded operating system :/ The startup time is under 5 seconds after the kernel has loaded. And before you complain: NO, I WILL NOT CHANGE THE FONT of the document. IT IS THERE ON PURPOSE, as explained in chapter one, line one! Please submit any additional setups. Any information regarding your (not) running systems is really needed, as there was no outside lab testing and I would like to add a list of supported systems. I hope you enjoy this document.

OptimizingOSX103Realtimev2.pdf

Update: I forgot to put the appendices in. If you have already downloaded it, please get version 2, as there are some additions regarding security and the RamDisk that might be of interest to you.