Main

25.01.10

HTML5 Video - Introduction and Commentary on Video Codecs

There is a raging discussion out there on the web about the upcoming HTML5 standard and the inclusion of the video tag. Not about the tag itself, but about the codec used for videos played inside that tag.
Free software advocates are raising a firestorm, wanting the only codec allowed inside this tag to be the (largely) patent free, open source Theora codec - the other side wants the ubiquitous, high quality H.264 video codec. I think I can weigh in on that debate. If you don't care about, or already know about, codecs, containers and their history, jump down to "my take on the codec war".

I am a content producer, have been following video on the web since the very very beginning, have advised firms on how to handle video on the web, have struggled countless hours trying to find the best solution for putting video on the web, and have so far refused to use Flash to display video on the web. I always believed that the web should be fundamentally free of technology that is owned by one company which can then take the whole web hostage to its world domination plans. I had hoped that the video tag would be introduced much earlier in the game, and I have watched with horror as YouTube & co. made Adobe - a company that basically stopped innovating 10 years ago - the ruler of the web when it comes to moving pixels.
Now this is finally about to change - or at least that is the intention of Google, Apple, Mozilla and others who are pretty fed up with Flash for very obvious reasons (it's slow, development sucks, it's proprietary, the source code of the creations is always closed, it's slow, it's slow as fuck, it eats energy from the processors like nothing else). It never really made any sense to put a container inside a container inside a container to display a video - the second most power-hungry thing you as a consumer can do on your computer (the first would be 3D/gaming).
Yet a video is not a video. A video, to fit through the net, needs to be compressed - heavily. Compression technology is nothing new, but it evolves over years and years. It is always a tradeoff between size, quality and processing power. The "Video" codec by Apple - probably the first "commercial" codec available to a wider audience - looks rubbish, but is insanely fast (it utilizes almost no processor on a modern machine) and the file size is pretty alright. It was capable of playing video on an 8 MHz processor, mind you.
Over the years lots and lots of codecs have sprung up - some geared toward postproduction and some toward media delivery - and there is a fine line. For the postproduction codecs you need full quality and just try to save a bit of storage. These are videos that still need work, so you want mostly uncompressed or losslessly compressed video. Processing power for decompression is an issue because you need to scrub through the video - and the compression side (making the video) shouldn't take ages either, because you want to work in realtime and not wait for your computer to re-encode a video just because you clicked on a pixel.

The other codecs are the "end of line" codecs - delivery codecs - made to compress the crap out of the video while "visually" losing the least amount of quality and producing the smallest possible file sizes. Here it doesn't matter how long the compression takes, as long as the decompression is fast enough to work on low end computers, to reach the largest available audience.

While production codecs are fairly fluid - people switch as soon as a better one becomes available; it takes less than 5 months for a new codec to establish itself (recently Apple released the ProRes 4444 codec and most postproduction companies are already using it; those that don't use image sequences instead - but that's a whole different story) - the delivery codecs are here to stay for a very very long time, because in the age of the web people just don't re-encode their stuff and re-upload it - if it's there, it's there.

Now before I go into the format war and my take on it, there is one more concept I need to explain shortly - containers. Flash is a container for a video with a certain codec displayed in it. So is QuickTime, so is Windows Media, so is Real Media. It gets confusing because MP4 can be a container and a codec at the same time. A container just holds the video and adds some metadata to it - but the raw video can be ripped out of the container and put into another one without re-encoding. This is what Apple is doing with YouTube on the iPhone. Adobe's last "great" innovation (or best market move ever) was to enable the Flash container to play H.264 (a codec) encoded videos. Since Apple (among everybody else who isn't a flashy designer) thinks that Flash sucks, they pull the video out of the Flash container and put it into the (now equally bad) QuickTime container, and so you can enjoy Flash-free YouTube on your iPhone.
Now with the technicalities out of the way, what's all the fuss about?
HTML5 promises - once it becomes a standard - to advance the web into a media rich one without bolted-on add-ons and plugins that differ from platform to platform and browser to browser - a pretty awesome development for most people ever developing anything on the web. Part of the process of making this the new standard is to involve everybody who has something to say and is a mover and shaker on the web, to give direction to where this standard is going. It's a tough rough ride - everybody and their mother wants to put in their tech, their knowledge, their thinking - I really would not want to be the decision maker in this process if you gave me a trillion.
The biggest and most awesome change in HTML5 - and the one most obvious to the end user - will be the inclusion of media content without a freaking container that needs a plugin that only half or less of the internet population has. To make this happen, at least all the big browser makers need to agree on what can be played inside the new tags (video & audio).
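To make this concrete, here is a minimal sketch of what such markup could look like (the file names are made up, and which source a given browser actually picks is exactly the open question):

  <video controls width="640" height="360">
    <!-- the browser plays the first source it can decode -->
    <source src="clip.mp4" type="video/mp4"> <!-- H.264 in an MP4 container -->
    <source src="clip.ogv" type="video/ogg"> <!-- Theora in an Ogg container -->
    Your browser does not support the video tag.
  </video>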
This is where the debate heats up. I really don't understand why audio doesn't publicly spur the same debate as video does - but that's probably because Google is involved in the video debate and can change its direction completely on their own, with whatever they choose to support on YouTube.
The two competing codecs are Ogg Theora and H.264. Now, I am less familiar with the Ogg codec (but have tried it), so first a small history of H.264. Back around 2000-2001 a company called Sorenson developed the first video codec that was actually usable on the web - there were different ones before, but they all sucked balls in at least one of the departments that make a great delivery codec. Sorenson made a lot of video people who wanted to present their work on the web very happy. Apple bought in and shipped QuickTime with the Sorenson codec and the ability to encode (make) video with this codec - albeit with a catch. To really get the full quality of Sorenson you had to buy a pro version - which cost a lot of money. The version that Apple included could play Sorenson (pro or non-pro) just fine, but the encoder was crippled to one-pass encoding only. The real beauty and innovation was in two-pass encoding - basically the computer looks at the video first and decides where it can get away with more compression and where with less.
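To illustrate what the second pass buys you, here is the idea reduced to a toy JavaScript sketch (grossly simplified and nothing like a real encoder - the numbers are made up):

  // pass 1: measure how "busy" each scene is (a real encoder
  // analyses motion and detail per frame)
  var scenes = [{ complexity: 2 }, { complexity: 8 }, { complexity: 5 }];
  var totalBits = 1000000; // the size budget for the whole file
  var totalComplexity = 0;
  for (var i = 0; i < scenes.length; i++) totalComplexity += scenes[i].complexity;
  // pass 2: hand out bits proportionally - busy scenes get more,
  // quiet scenes get squeezed harder
  for (var j = 0; j < scenes.length; j++) {
    scenes[j].bits = totalBits * scenes[j].complexity / totalComplexity;
  }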
Apple and the users were not really happy with this situation at all. So for a long time (in web terms) there was no real alternative to that codec. The situation was even worse because to play Sorenson you had to have QuickTime installed - before the advent of the iPod a losing prospect - I think they had a 15% installed user base on the web. Those were the days of the video format wars - Microsoft hit back with Windows Media (which sucked balls quality-wise but had a huge installed user base) and on top of that there was Real Media (which was the only really viable solution for streaming video back then).
In the meantime another phenomenon happened on the audio side - MP3 became the de facto standard - a high quality one at that (back then) in the audio field. We the video people looked over with envy. When producing audio you could just encode it as MP3 pretty much for free with shareware apps, upload it to the web and everybody could listen to it. There was nothing even close happening on the video side. The irony is of course that MP3 is MPEG-1 Layer 3 - part of a video codec standard - but the video side of MPEG-1 sucked really really really bad. Quality: shit. Size: not really small. Only the processor use was alright-ish, but not great.

Jumping forward a couple of years (and jumping over the creation of MPEG-2 - the codec used for media delivery on DVDs - totally unsuitable for web delivery), the Moving Picture Experts Group - a consortium of industry companies and experts that developed (and bought in) MPEG-1 and MPEG-2 - decided to do something for cross-platform standard video delivery and created the MPEG-4 standard (skipping over MPEG-3 for various reasons - mostly because of the confusion with MP3 (MPEG-1 Layer 3)). MPEG-4 is a container format - mostly - but it allowed for reference codecs, and the first of these was H.263 - this was already on par with Sorenson quality-wise, yet in a container that was playable by QuickTime and Windows Media - the two last standing titans of media playback (by this time Real Media had mostly lost any format war already). Great, you think - well, not quite - Microsoft wasn't enormously happy and created its own standard (as they do) based on MPEG-4 H.263, called VC-1 (I am not really familiar with this side of the story, so I leave you to Wikipedia to look that up yourself if you are so inclined). Web video delivery was still not cross-platform, sadly, and the format war became a codec war, but there was now a standard underlying all of this, and the quality - oh my, the quality was getting good. Then the MPEG group enhanced the H.263 codec and called it H.264, and oh my, this codec was a pure godsend in the media delivery world - it looked great scaled up to huge resolutions, could be used online, for streaming and on HighDef DVDs, and in the beginning it was all pretty much for free.
It looked like an Apple comeback in webvideo delivery, because QuickTime was for a while the only container that could play H.264 without problems. Around that time Flash started to include a function in its web plugin to play video - interestingly enough, they chose to include Sorenson video as the only supported codec - word on the street was that Sorenson was very unhappy with Apple's decision to ditch them as codec of choice and push H.263/H.264 instead. Now the story could have ended with Apple winning the format war right there and all computers having QuickTime installed by default, but it didn't, because out of nowhere YouTube emerged, and YouTube used Flash, and YouTube scaled big time and made it - for the first time ever - really easy for Joe the Plumber and anybody else to upload a video to the web and share it with the rest of the world. It changed the landscape in less than 6 months (I watched it, it was crazy). Now as a content producer you finally had a really good codec to upload video in very good quality, but the world chose the worse quality inside a player that sucked up 90% processing power, with the codec of choice needing another 90%, and all that came out was shit looking video that everybody was happy to be done with - but the user experience of hitting an upload button and having everybody watch your video was just unbeatable. Eventually, just when people realised how bad these videos looked compared to some Hollywood trailers that still used QuickTime and H.264, Adobe included H.264 in Flash and thereby prolonged their death once again (without innovating at all, it must be said).
Now fast forward to today - again a group of clever people, big companies and such have sat down to bring us HTML5 and the video tag. That tag, as said, is going to rid us of any plugins and containers and instead just play pure video as fast as possible right inside any browser that supports HTML5. Now the problem is that people cannot agree on the codec to be used. Why, you ask, if H.264 is so great? Because H.264 was developed by a for-profit group of people, and they want to make money, and they have freaking patents on it - not that this has hampered web video to this day in any way - but for the future standard, lots of people seem to have taken offense at that. There is in fact a whole ecosystem of alternative codecs (audio and video) in the open source world, and the most prominent is Ogg and its video incarnation Theora. They are mostly patent free because the company that originally developed these codecs gave the patents to the open source community (yet it's still not clear if that covers the whole codec). Now, what happens when patents enter the WorldWideWeb could be seen with GIFs. GIF graphics (moving or non-moving) were once a cornerstone of the web - a more popular choice for graphics than anything else (small, could be read by anything, blahblah) - then a company found out that they had the patents on it (luckily just shortly before they ran out) and sued a lot of big websites for patent infringement, wanting royalties of $5000 from every website that used GIFs - they would have killed the web with that move (and they were in the right, law-wise) had the big companies they sued first not dragged out the court cases until the patents ran out - now the GIF file format is in the public domain.
Now, it's understandable that this lesson should be learned, BUT - and here is

my take on the codec war:

Flash is a MUCH bigger threat than patents on the codecs used. Because not only does it use the patent-encumbered codec inside its container, but the container itself is totally owned by one company - a company that has shown often (PDF) that it will do everything to take control of anybody using its technology - even if that is the whole world.
Now, 95% of all web videos are delivered by Flash these days, and to change that a lot of things need to happen. First, Google needs to drop it on YouTube - they just announced a beta which does just that - but even with Google's might that's just not enough - content producers need to hop on board as well. And here is where the chain breaks for the "free and open codecs of Ogg". As you can see from the history above, H.264 has been the industry standard on a wide range of devices, including the web, for years now. The whole backend has settled on this and there are really good workflows to create H.264 video. With the video tag, YouTube is less relevant than it was in the beginning, because all of a sudden it's easy to incorporate video into your own webpage. Now, if Google were to say "we use Theora only", high quality content producers would just say "fuck you" and post their videos on their own sites in much better quality, without the hassle of finding any workflow to produce Theora videos (for non-terminal-using people there still just isn't an easy way to do that which could be used in a professional, non-fiddly environment - we like to create, not code for delivery, sorry).
But that's not enough - almost ALL consumer cameras released over the last 2 years, including the hot shit DSLRs with video functionality, produce H.264 that can be "just" uploaded to the web without re-encoding - that saves YouTube and Vimeo a lot of processing capacity - and with their lousy revenues they sure don't want to add another server farm just to re-encode each and every video they have to a codec with worse quality. You know, 90% of all videos on the web are already encoded in H.264 as of now (and Theora maybe has 0.2% of the other 10% that are left over). It's uneconomical and not sensible to re-encode all of that any way you look at it - especially since the quality is not surpassed by any other codec out there - patent free or not.
I would say go H.264 now, and have a new consortium of browser developers and other companies develop a new codec (or build upon Theora) from scratch that is patent free AND high quality AND has a good workflow (meaning it is supported by hardware vendors and OS vendors across the board). That can then take over from H.264 (just like PNG took over from GIF in less than 2 years following the patent threat). Leave the codec question open for now and let the web sort it out for itself (for the moment) - like with the img tag - it doesn't matter if you put PNGs, GIFs or JPGs in it (or any of the other plethora of image formats); as long as the browsers support it, it's watchable - and so far that has shaken out a good road to take (see the switch to PNGs with transparency, which in my opinion also helped a lot to bring down IE5, which didn't fully support them - so the market sorted it out quickly (as in 5-years quickly)).
BTW, the only browser maker that just does not want to go down that route (and rather wants to cripple itself with Flash in the meantime) is the oh so open minded Firefox. Sorry, I fail to see your point, dear Mozilla developers - you are not gonna make a lot of friends that way outside of the very very small open source community (and even there your approach is not liked universally, for example by those who cannot install a Flash plugin because Flash is not supported on their platform (PPC Linux f.e.)).

Get rid of Flash first, then get rid of H.264 later, when you have something equally good on all accounts. Going backward with technology is just never the way forward - open source or not.

24.03.09

3d in the browser - is it really finally coming?

VRML was once said to be the future of the web - everyone who ever tried it out back in the good old days will agree with me that it was doomed to fail right from the beginning. It went under and was never seen again after the second generation of browsers. Modern browsers had other stuff to worry about - like passing acid tests and such - so 3D was never a main concern. Now word from the Game Developers Conference hits the street that the Khronos Group is working together with the Mozilla Foundation to bring accelerated 3D graphics inside the browser window. The Khronos Group is responsible for OpenGL and OpenGL ES (iPhone is all I say here) and Mozilla of course for Firefox. They formed an "accelerated 3D on the web" working group that will create a royalty free standard for browser makers to implement and webdevelopers to use. Hallelujah - now it might take some eons for a) a standard to form, b) browsers to adopt the standard and c) 3D programs to let you export stuff in the right format, but the prospects for real 3D in the browser in a 3-5 year time frame are exciting to say the least. Personally, for me this is bigger than vector (as it hopefully includes vector) - the possibilities are endless and truly exciting. Be sure to hear back from me at the earliest inclination of any beta or even alpha warez to try this out.

via internetnews.com

1.03.09

Bruce Sterling essay about the future of the web

A fantastic, powerful transcript of a speech by futurist and SciFi writer Bruce Sterling at the Webstock09 conference. He talks about the transition from web 1.0 to 2.0 and how that is now transforming into something completely new - his bet: "ubiquity" - being networked anywhere, anytime. He says it's a rollercoaster ride we have all boarded now, and that the future will look different but will be what we make it. And he couldn't hit it home any better:

We've got a web built on top of a collapsed economy. THAT's the black hole at the center of the solar system now. There's gonna be a Transition Web. Your economic system collapses: Eastern Europe, Russia, the Transition Economy, that bracing experience is for everybody now. Except it's not Communism transitioning toward capitalism. It's the whole world into transition toward something we don't even have proper words for.

The Web has always had an awkward relationship with business. Web 2.0 was a business model. The Transition Web is a culture model. If it's gonna work, it's got to replace things that we used to pay for with things that we just plain use.

In the Transition Web, if you're monetizable, it means that you get attacked. You gotta squeeze a penny out of every pixel because the owners are broke. But if you do that to your users, they will vaporize, because they're broke too, just like you; of course they're gonna migrate to stuff that's free.

But you know, I'm not scared by any of this. I regret the suffering, I know it’s big trouble -- but it promises massive change and a massive change was inevitable. The way we ran the world was wrong. I've never seen so much panic around me, but panic is the last thing on my mind. My mood is eager impatience.

via boingboing

25.02.09

60 sites with free 3d models - 150+ sites with videotools

Via Twitter come two really valuable links that I think are very much in my scope. One has links to and explanations of 150+ video tools that you can use online - ranging from video sharing, editing and streaming to commenting and hoarding. The other is personally even more useful - it contains 60 sites that sport free-as-in-beer 3D models. Countless times I have needed a very generic object just to populate a scene and have searched the three lousy sites known to me without luck.

150+ online videotools and resources

60 free 3d object sites

via @dollars5 & @LisaTorres.

12.02.09

YouTube goes CC and Micropayment!

Google has announced that YouTube will let users include Creative Commons licenses and make files available for download, either free or for a "small" fee through Google Checkout. That is a killer move in a lot of ways. First, it's great publicity for CC of course - so they really really have to figure out how to a) enforce their licenses better and b) find a final solution to the pesky embed problem - especially with YouTube.
The other side is that it might finally make micropayments possible - not just possible, but established. I wonder if there will be "GoogleCredits" soon - where you buy a bunch for $10 and then pay with them - basically Google making its own currency - at the moment they can't really make a profit on sub-$1 transactions (unless they have figured out a way to do it). Interesting to watch progress on this front - even if it is evil angel Google.

Here is the official blogpost.

9.02.09

Time spent on corporate owned social sites = lost time

I have written about this before, and I knew I was right then, but I had no real data to back it up. The topic: time spent on social networks or other data-collecting entities is very much time better spent somewhere else. You are bound to lose your data and your time at some point, or your data gets sold to a company that you might be inclined not to support, or the company you put all your stuff into turns evil.

But here are two examples of late.
First, my favorite bookmarking site ma.gnolia.com went down the tubes - they had an unrecoverable data loss - meaning all the people spending time there putting their bookmarks up, commenting on them, making lovely tags and sharing them in groups with like-minded people lost everything they did - it happened overnight without forewarning. A technical glitch and everything was just gone - now you can say this could happen to anyone - but anyone who cared enough about their own data would make a backup of a backup of a backup (and as it happened I made a backup - the first one ever - just one day before, saving my bookmarks from doom). While I was lucky, there are a LOT of very pissed people out there who lost everything (one person could recover 80 bookmarks out of 80,000!). Now, the reason I moved to Ma.gnolia in the first place was the first round of talks between Microsoft and Yahoo - Microsoft buying all assets of Yahoo, and del.icio.us, the "first" real bookmarking site, having been bought up by Yahoo - needless to say, the last thing I would want is my link collection (a nice data mine) in the hands of Microsoft.
This could all be filed under anecdotal evidence if it weren't for a second event. Apparently Google is deleting blog posts of bloggers on their blogger.com network that talk about certain bands. This seems to come from pressure by the RIAA.

From LA Weekly:

“I’d received the label’s press releases and followed their directions, spending my time and energy to promote their albums,” explains a frustrated Spaulding. “By pulling down my post, they destroyed my intellectual creativity, the very same thing they’re erroneously accusing me of doing. Say someone had linked to that post, or [blog aggregator] Hype Machine — it’s gone completely. If I go into my Blogger table of contents, it’s gone. Not de-published — gone.”

Now, I would be sympathetic with this guy, but everybody who does not understand that putting your work and time in other people's hands makes that content fair game for anything should get their hands off the internet asap. Sadly I see even figures whom I admire for their past advancement of the net blindly following the herds into the great web2.0 land where the roses are blue and the sky is nicely pink with glittery lickable interfaces on top. I think it will take one of the big social networking sites going down (and I predict a bankruptcy of MySpace or Facebook in the coming 12 months) for people to realise that all they have done for the last 4+ years was put their time and energy into the hands of some megacorps that in turn sold their data mines and gave a shit about the actual content published.

The only exception I see here are volatile services that don't preserve data well in the first place. The glorified chat blabber at Twitter might be one of those services where a loss is not the biggest problem - except if you really care about the follower number that you tried so hard to build over the years, which could also vanish. Generally it would be very healthy if tribes would find each other and build inner-tribe services that offer these - often quite mundane and easy to copy - corporate controlled services, and then develop an open (in all senses of the word) protocol through which all these individual services could talk together (web 3.0 proposes the underlying architecture for that by making all content truly portable, with a nice semantic XML file sent along explaining everything about the data). This would take a lot of free speech pressure off the net itself, because a highly linked but decentralized approach would not be so easy to attack (as in the case of the RIAA suing Google f.e. and having a big hit with it because Google controls so much data - imagine them having to sue 20,000 sites individually). It would also put an end to the "all these services don't have a business plan" problem, as these services would then be under the control of the community, which might want to fund them or has enough volunteers to run them on no dime - also the infrastructure would not need to be as massive as with a worldwide service, taking load off the worldwide net and localizing things (my guess is that most private photos are also only watched in the same country they were taken in - yet here we go saving the photos on the other side of the planet in an unknown location, and each time grandpa wants to look at them they generate traffic around the globe).
There are lots more positive points for re-opening the internet and making it tribe based. Otherwise we will all have a Facebook Operating System soon, because the majority of people want that (just look at the "task bar" to see that they are working in that direction very clearly).

31.01.09

Kill the Flash

My disdain for Flash has been well documented here on this blog. It is the most ugly, pervasive piece of software bugging the universe since its introduction, and it gets more and more ugly with each iteration - especially on the Mac. That ugliness is not only shown in the creation environment - sporting an interface that is only usable for total masochists - but also in the exclusiveness and non-openness of the resulting content (closed containers of anything on the web just simply need to go to hell), riddled with patents all throughout, suppressing any healthy developer community that is not just in it for the fame of making the worst usable interface - and its plugin iteration installed on millions of Macs is also DOG FUCKING SLOW. Next to the horrible Skype it's the only other program that can make my computer feel like it's trying to render a 3D animation on 1992 hardware - and it's just rendering freaking vectors and a couple of mostly low res pixels - cross checking with some advanced compositing applications has revealed that the drawing code in Flash must have some extra loops in it to make it extra super slow, trying to make the power companies rich by making the computer work harder and therefore draw more power than it needs to.
Flash 10 makes matters WAY worse - despite the press release copying outlets (MacTechnews, Golem f.e.) claiming otherwise. While with Flash 9 you could possibly open 10-15 movies side by side without crashing your browser or bringing everything to a complete halt (smooth playback seems only possible with a maximum of 3 video res movies at the same time), Flash 10 barely manages one at 80% processor utilization - make it two and you are in 10-15 frames per second land. Well, it just speeds up Flash's total self-destruction and might speed up ramming through HTML5 (video/audio embed tags), CSS3 (animation) and SVG (vectors vectors vectors) in the end.
In the meantime I saw Todd sporting a handy small plugin for Firefox that displayed just green squares where there should be Flash content, and you could click on one to display the actual content - since I am not using Firefox (too slow on a Mac) but rather Safari, I didn't bother to ask what the plugin is called, but today I stumbled over a similar plugin for Safari. It even works with the yet unreleased WebKit nightly builds (which I run exclusively - I like to be on the edge of things - and unlike Flash, Safari will see a tremendous boost in speed in its next iteration, which everyone can already enjoy by using a nightly build of WebKit). So for getting rid of Flash without getting totally rid of it - because, well, people like putting videos in a container in a container in a browser in an OS, and I like/need to watch movies at points - there is the Safari plugin called ClickToFlash. You get grey boxes with a gradient and a "Tasteful 'Flash' icon now drawn on top of" it (the developer's words). Click on the box and the Flash content loads on demand. I am enormously happy. And for webdevelopers still thinking it's a good idea to put any Flash in menus or other content that does not absolutely require Flash (no content really does, but that is a different issue) - screw yourself - get a life - learn how to use open standards - I will be unable to navigate your site in the future and so will many many more people. A nice side effect btw is that all ads seem to have completely vanished except for pure text Google AdSense - I didn't even notice that ads are nowadays almost exclusively Flash - browsing just got so much more uncluttered, faster and better :)
You can download the very open source code and a prebuilt installer (scroll down, because that is what you want) here.

20.01.09

Location Based News on the EiPhone

I really like location based services and am intrigued by the possibilities that GPS offers as an interface between the real world and the virtual world (I am obsessed with interfaces that bridge reality and virtuality), so naturally a location based news service is clearly in the "how cool" column of my thinking. I mean, just walking down the street in a foreign country and reading the gossip of the neighborhood, seeing what parties look like in private blog photos and reading about the latest murder around the corner (well, I could live without that I guess). Radar is an EiPhone app that introduces this kind of service. It could probably become a cool pastime to skate around town collecting news and then read them relaxedly at home. Awesome.

Via Mashable via Eyebeam.

12.01.09

Facebook (& co) the new Microsoft

Lock-in - mega mass market - proprietary - driven by advertisement. Those are things I connect straight to Microsoft's Windows - which by what it does has no right to exist on the digital planet - the only way it survives is that it has successfully locked in a mega mass market with proprietary mechanisms and keeps itself alive through an advertising budget larger than the GDP of some African countries. Luckily it's a slowly dying dinosaur. Yet we wouldn't be humans if we didn't fall for the same traps over and over again - not learning a thing from past mistakes - and so here we go into 2009 and I have already received 20 Facebook spams - ahem, friendship requests. Now, my take on the whole social networking thing has always been clear - I am not spending my time on something that makes other people rich while providing no apparent value to humankind, or worse, halting innovation and true progress.
I truly believe that Facebook and MySpace (and the German studiVZ) are clearly aiming to become the new Microsoft: they want to lock in their users, control the content the users see and make it hard for them to switch - and the most proprietary of them - Facebook - the only one that did not join the OpenSocial endeavor - seems to be the winner of them all.
Now, if you look at a Facebook user page (my gf has one, to my distress) it already mimics an operating system nicely, with "webapps" and a task bar and such. Now, what is it offering again? Basically it is a glorified chat system with some blogging functionality and a working RSS feed implementation. The only thing it's good at is combining these tools and organizing the output. It does so by making "friends" the top category of organization. I don't understand the concept of "friends" whom you are unable to track through your life except with a digital tool - I sincerely think people you are unable to track with your own memory are not real friends and might even be too uninteresting to be acquaintances in the first place. You know, there are people who contact me, or I contact them, after we have not "spoken" (in any sense of the word) in over 5 years, and yet both sides tracked each other over the years, because there was more than a hyperlink connecting us.
Now, organizational tools are not bad per se. Indeed, one could argue the popularity of the social network sites comes from people needing to organize their input stream somehow. I think it's laziness not to learn the underlying principles of the web and understand that all the tools to connect to your friends and make them aware of your output are there without Facebook et al. - the most frequent reason I hear when I challenge someone on why they are using a social network site is "it makes it so easy". I think once some of the 2nd tier sites (with Facebook becoming the no. 1) go belly up in the next few months and people see their "friends" disappear because their content was locked in - all the time they spent on the site that they thought was theirs but isn't, and maybe even some of the content they put up going into the "wrong" hands and being used to ridicule them (it's not their content anymore anyway) - we might see an awakening. Until then, maybe there is hope to make a decentralized, open source, open community out of the internet again, and not a giant hub of ad-sponsored data mining endeavors.

Meanwhile, please, to all my "friends" out there: don't send me more Facebook/MySpace/studiVZ spam - I have enough other spam problems already. I won't join even if 90% of the world is signed up.

19.12.08

Webdesign - a designer's perspective: Introduction

This is going to be a multipart series giving my view on the webdesign process in this day and age, with nitty gritty details - too much for most casual readers, not enough for the more professional webbers, but just about right for those making websites who come from a more design-oriented perspective. It will touch each and every part of webdesign as a whole - from the artwork to the programming to the content management to the hosting problems to the philosophies, the existing standards nobody cares about and the nonexisting standards everyone seems to care about. Guaranteed not Flash-bashing-free and guaranteed full of sarcasm. Here is the intro to this never ending saga.

About ten to twelve years ago I got into developing my first website - back then I was neither designer nor programmer, but I had experience in the latter and was about to study the former. I made this website with a lot of JavaScript and made the mistake of using images as text, because I wanted one font throughout the site. The site looked alright (to me ;) and functioned fine, but was unmaintainable for a person who wants to do something else with their time than hacking pure HTML. It had one nice feature though - swappable images - something you see a lot nowadays in those fancy iframe galleries - back then I did not encounter another site that had it. Oh, JavaScript - that was something I could identify with, as it came so close to the BASIC programming I learned while the east was still the east and the west the bad. "You can't use JavaScript, it's bad practice - nobody should use JavaScript ever" was all I got for my month-long effort of making a webpage. The site sat there for a long long time without any updates and got very stale - at one point - right after the only update it ever received - browsers started to break the site and I decided to pull it.
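For the record, the swappable-images trick was just a handful of lines of JavaScript - a sketch from memory, file names made up:

  <img src="button_off.gif" name="navbutton"
       onmouseover="document.images['navbutton'].src = 'button_on.gif';"
       onmouseout="document.images['navbutton'].src = 'button_off.gif';">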
Since then I had been pondering making a new site - one with a cool content management system that would let me easily update it, one that is very cutting edge and puts technology to good use. The longer I pondered, the clearer it became that I did not want a website for myself, but rather a framework that could be an umbrella for the myriads of my interests and for those around me wanting to produce something cool together with me. On the design end I was still learning and trying to find my style, my design mojo. The years went by and the website did not really progress other than in my head. There was a quick attempt about 3 years ago that already had an interface designed, but it never got off the ground because I actually had to finish my diploma and all. Then last year, pushed by a lot of different things, I actually took the plunge - and man, if I had known back then what a journey it would become I would have thought twice about going along, but now, 35,000 lines of code and about a year later, I am extremely happy with the outcome so far.

Next part in the coming days: Choosing the right Content Management System

13.12.08

Germany to become bandwidth heaven?

You can say what you want, but sometimes you get officials that are surprisingly bright - not very often, but it happens. So I read today that the German government is considering bringing 50 Mbit/s broadband to every household by 2018 - including remote villages. I was about to close the page with the thought that it might be another scam by the German Telekom, which is already offering 50 Mbit/s via VDSL - which is clearly aimed at broadcasting TV and makes the receivers the same couch potatoes they have always been. For me, any advancement in internet speed that is not bidirectional can stay where it is. How is a 50 Mbit/s download going to help when the producers of podcasts have to wait 2 hours to upload their stuff through a 1 Mbit/s upload pipe - then the only people capable of broadcasting are again those with money, the brainwashers, the media elite - those whom the internet has no use for at all if it is to thrive - those that are too big to fail but are failing.
Then I clicked on an adjacent link and my eyes almost popped out:

The 50 Mbit/s via VDSL currently offered by Deutsche Telekom are "ridiculous", the regulation expert found on Thursday at a hearing on Next Generation Networks (NGN) in the Bundestag's subcommittee on New Media.

That's the vice president of the "Bundesnetzagentur" - the German Federal Network Agency, regulating our communication networks - saying that a 50 Mbit/s VDSL network is laughable and absurd. I could not agree more, and that this agency is aware of the scam that DSL - and even more so VDSL - is, bodes very well for the future of the German internet. Now if only the companies would start offering SDSL at affordable prices (at the moment SDSL prices range from 3x to 20x those of the high end DSL connections - starting around 60 euros a month for 1 Mbit/s up/down). Oh, and they finally see the light in laying empty pipes for future networks - glass fibre for example - whenever the ground is dug open anyway (different story, but here in my hometown they ripped out glass fibre 5 years ago - after installing it when the wall came down 15 years ago - because they did not want to invest in the backend infrastructure to make internet work over fibre - no joke - that is the German Telekom if you don't know them yet). What a novel idea - too bad that everyone and their grandchildren already had this idea 15 years ago when everything was dug up anyway.
Anyway, it's not the worst news you can get in the morning, especially when you have spent the past week investigating how to transmit streaming video from a remote location.

The quote (translated above from the German) is from an article in Technology Review.

9.12.08

Javascript Processing Engine

I have voiced a lot of disdain for the closed source, proprietary, slow, resource wasting crap that is Flash. Flash is not good for video playback (for a number of reasons everyone but Google would admit to), and it's also quite dumb to jail images in Flash containers - but both of those things are common practice around the net (Flickr, YouTube). Both will be part of an ugly past once HTML5 hits the street and moves toward a 50% usage point. The SVG, video, animation and audio capabilities are just too cool to pass up on as a developer - and it's all open and standard and compliant and such.
Yet there was one thing for which Flash was until recently considered the only viable option: browser based games and interaction projects. Well, in the offline world there is a programming language called Processing that is used to make games and physics simulations and interaction models and and and. It's a breed of its own when it comes to programming languages - quite easy to learn and follow and very powerful in what you can create - but unless you are a Java fetishist there was no way to run Processing stuff inside the browser.
Now, with the new breed of browsers around the corner - those that will support HTML5 and have advanced JavaScript engines that support the canvas element - it is possible to use the Processing language inside JavaScript with the JavaScript Processing library. Just check out the demos (you might have to use a nightly build of your favorite browser to see them working) and see the potential. I did test them with the latest WebKit nightly, and all demos are extremely smooth, fluid and amazing. With these there is absolutely no excuse to use Flash anymore. I will be surprised to see one major website with any kind of Flash content in three years - mark my words. JavaScript and HTML5 and SVG are going to be the solution to all your pain very very soon - and as a good side note it has to be said that there will be so much kick ass content that people with Internet Explorer will want nothing more than to upgrade to something modern (and I do not think there is any way for MS to catch up to either the Firefox or the Safari or the Chrome JavaScript engine and make HTML5 and SVG available in any amount of time).
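If you have never touched the canvas element, this is the kind of drawing loop everything above is built on - a minimal sketch in plain JavaScript, no Processing library involved:

  <canvas id="stage" width="300" height="150"></canvas>
  <script type="text/javascript">
    var canvas = document.getElementById('stage');
    var ctx = canvas.getContext('2d'); // the 2d drawing context
    var x = 0;
    // redraw roughly 30 times per second with setInterval
    setInterval(function () {
      ctx.clearRect(0, 0, canvas.width, canvas.height);
      ctx.fillStyle = 'rgba(0, 128, 0, 0.8)';
      ctx.beginPath();
      ctx.arc(x, 75, 20, 0, Math.PI * 2, true); // a circle drifting right
      ctx.fill();
      x = (x + 2) % canvas.width;
    }, 33);
  </script>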

17.11.08

Spam Architecture

Alex Dragulescu is making automatically generated architecture - his data input is spam emails. Keywords in these spam mails are used to generate planes in three dimensional space.

7.11.08

Internet Exploder soon to have WebKit heart?

The world is changing indeed. I have now been working on our website for over a year, and at one point I ditched support for Internet Explorer in favor of coding standards compliant XHTML. I wanted a website for the future and not for the past. I am only testing in Firefox and foremost in Safari's WebKit (the developer nightly builds of Safari), because by all that I can see it is just the most standards compliant browser out there - so if stuff looks good in there and validates, you have good code. Now, I am still looking at browser statistics and pondering whether ditching IE was such a good decision, because it still has 75% browser market share. Then along comes a blog entry and the world gets a different face. After Google incorporated WebKit in their Google browser, Microsoft is thinking about following along with a future generation browser of their own that uses WebKit. I don't know if you can understand what that means. But that would put Safari's WebKit in the number one browser spot quite instantly, surpassing Firefox. That also means the world would be left with only two competing engines, but both of them try to be standards compliant as hell. (Hey, I am not counting Opera - which as the total underdog browser still does not deem it necessary to have a standards compliant rendering engine). Heck, Firefox and Safari both push CSS3 out the door, and SVG is at least in WebKit working very well, to the point where you can start animating in it. That means standards compliant moving vector interfaces with custom fonts - together with the audio and video tags that are coming as well, this means not only bye bye Internet Exploder but also bye bye Flash... Oh happy days... Of course the whole thing is to be taken with a grain of salt, as there is nothing official about it - but one can dream, right?

Steve Ballmer said:

"Open source is interesting," he said. "Apple has embraced Webkit and we may look at that, but we will continue to build extensions for IE 8."

31.08.08

SpamWar: The Captcha Lie

I have been a fierce opponent of captchas from the day they emerged, because I understood from day one that there is no solution to the spam problem that involves more barriers and more walls - walls never work over the long term. I also believed that if a human can solve a captcha, then a computer can too at some point down the road - just as so many other spam fighting systems that were sold to the masses as "the perfect solution" have been tricked. Also, I hate captchas myself, and recently some of them have become undecipherable even for a human. Some even want you to learn some code to decipher them - something I am not willing to do when I just want to post a quick comment or such. Captchas are one step toward taking the bidirectional out of the internet.
But all that is moot when there is a huge industry that earns its money solving captchas with human power instead of machine power - and that is apparently already happening in India. For $1.50 you get 1000 human-solved captchas - that is a lot for a little. If someone wants to flood a captcha "secured" site with spam - and I believe captcha secured sites do not have more barriers, as they are captcha believers - then for $150 they get 100,000 accounts - and that is probably something that makes them a profit (as the 3% rule of spam states - that's 3,000 people buying some overpriced Viagra or fake Rolex watches).
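The back-of-the-envelope math, spelled out (the 3% response rate is the rule of thumb quoted above, not a measured number):

  // $1.50 buys 1000 human-solved captchas
  var costPerCaptcha = 1.50 / 1000;       // $0.0015 each
  var budget = 150;                       // dollars
  var accounts = budget / costPerCaptcha; // = 100,000 spam accounts
  var buyers = accounts * 0.03;           // 3% rule -> 3,000 buyers
  // even a few dollars of profit per buyer pays the $150 back many times over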

Matt Mullenweg, founding developer of WordPress, has phrased it greatly:

"Ultimately Captchas are useless for spam because they're designed to tell you if someone is 'human' or not, but not whether something is spam or not. Just because something came from a real human being doesn't mean it isn't spam...."

Read the full article at ZDNet, it's highly interesting.

25.08.08

Comics coming to the iPhone - a medium finds a new host

Gosh, I don't own one (and unless they get smaller and the data plans cheaper it will probably stay that way), but I love the iPhone for shaking up so many traditions and morphing us closer to a 21st century society - also it starts to save trees, which is all in all a great thing. How? Well, comics are starting to appear on the iPhone that would have been printed on dead trees before. Given that especially in Japan the throwaway cheap comic culture is huge, this might even have a measurable environmental impact at some point. But comics coming to the iPhone is even cooler than that. I have noticed the rise of webcomics lately, but I find reading them in a regular browser takes away a bit of the magic that happens when you dive into a paneled layout without distractions all around it that blink and try to draw you into a different universe - you know what I am talking about if you have ever seriously used a computer. A comic on the iPhone is different - you have the screen and the comic fills it - yes, it would break with the traditional panels-on-page layout and instead be serial panels scrolling, but you could even mimic the move from panel to panel - which sometimes goes left to right, sometimes top to bottom and sometimes just quirky, depending on the content - by sliding the panel out/in to the corresponding side. You could even give a "page" overview of how the panels would be arranged for every virtual physical page. In general I love the idea, and I think with the rise of epaper readers and the iPhone and the current attention deficit society, comics will become a much stronger medium in themselves.

Japanese Manga Comes To The iPhone (Mike Cane)
The Future Is Almost Now (Publishers Weekly)

23.08.08

idée TinEye Picture Search Engine actually rocks

There has been a plethora of new search engines around the web lately, all trying to dethrone Google, but none has really amazed as something that could even compare to Google. Yet today I came across TinEye (gosh, this reads like an advert - but wait for it - it's not, it's real). It is a pure picture search (no audio, video or text search) and it has a unique but seemingly working take on it.

With TinEye you either upload a picture or give it the web address of one, and then it looks all over the net for that picture, or pictures that look like it. Its database is now at something like half a billion pictures (that's 400 million and something, and probably counting upwards by the minute). Well, I thought I'd test it, so I registered (grumpy bumby, why do they all want a web address from me? you really think I am going to give out a real one that I actually use?) and put in some photos, personal and not so personal, and I got a success rate of 8 out of 10! That means of the 10 photos I tried, 8 gave me a result that was absolutely spot on. Heck, I could even ease my mind by finding out about that rocky, impossibly balanced house I posted a couple of weeks ago (it's a photoshop, ffs. Fooled me. From now on I never believe any photo that looks awesome anymore). Anyway, the engine is straightforward, works well and is incredibly fast - as fast as Google, and it has to do so much more than Google, one would think (I think I am having this commercial voice again - sorry, but I am quite amazed). Now, there is one huge implication this will have -> copyright. If you have copyrighted images on your site, and a photographer, artist or whoever actually enforces their copyright, expect to get nasty letters very soon (that reminds me, I still have to write this copyright embed research blog post that has been hovering on my computer for some time).

So here is a video explaining TinEye - but try it out yourself, I know you'll love it - it's a new way to search (also for your own pictures, if you have lost one ;)

21.07.08

Beautiful Javascript Creations

Ever since I got into the code behind the internet I have been a big fan of JavaScript. While there was a time when using JavaScript was seen as something only the clueless did, today's thinking has changed, and you can probably not get around the modern web these days with JavaScript turned off and have a pleasant experience. But I always knew that JavaScript's capabilities were greater than just pulling some data from some server and presenting it interactively in a glossy interface (so I am always trying out new stuff - look at the moving logo up on this site - based on code I wrote in 1997!). Well, with the SquirrelFish engine in the nightly builds of Safari, and with Firefox 3, things are getting mighty speedy in the JavaScript world and things are possible that seemed impossible before. Like recursive mathematics f.e. - I stumbled upon a site that makes beautiful recursive organic pictures entirely with JavaScript and entirely inside your browser - no plugins needed - all open standard, nice JavaScript. I am genuinely excited.
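To give a taste of what recursive pictures in the browser look like in code, here is a toy sketch of mine drawing a recursive branch on a canvas (nothing to do with the site linked below):

  <canvas id="tree" width="400" height="400"></canvas>
  <script type="text/javascript">
    var ctx = document.getElementById('tree').getContext('2d');
    ctx.strokeStyle = '#262';
    // draw one line, then recurse with two shorter, rotated branches
    function branch(x, y, angle, length, depth) {
      if (depth === 0) return;
      var nx = x + Math.cos(angle) * length;
      var ny = y + Math.sin(angle) * length;
      ctx.beginPath();
      ctx.moveTo(x, y);
      ctx.lineTo(nx, ny);
      ctx.stroke();
      branch(nx, ny, angle - 0.4, length * 0.75, depth - 1);
      branch(nx, ny, angle + 0.4, length * 0.75, depth - 1);
    }
    branch(200, 400, -Math.PI / 2, 100, 9); // the trunk, pointing up
  </script>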

So download a WebKit nightly (the developer version of Safari) and start tinkering with the following site:

http://darknoon.com/organics/render.html

5.07.08

identi.ca / laconica open source twitter

Hmmm... Twitter, or "microblogging", is a very strange technology that I dismissed, then discovered, then dismissed again with the thought that it might be useful for short announcements for bigger projects that involve a lot of people with little time. Then again, Twitter is a freaking company that controls my dataflow, and we all know how I stand on other companies/groups controlling my dataflow and time and energy. So I was delighted to read today that there is now an open source microblogging tool called Laconica, and a Twitter-like service that builds upon that tool, called identi.ca.
The tool is heavily in beta and requires the absolutely newest PHP (5.2.1) and the PEAR library (which my ISP won't install for security reasons), and it has no automated database creation script yet - which means you must know how to create database tables by yourself. All this is a bit over my head at the moment, but maybe someone will integrate that code into Drupal and I can then integrate it into our bigger site (note to myself - Drupal can probably do stuff like that already).
Anyway, moving the datastreams of the world into open source, Creative Commons, open protocol territory is a very good thing.

http://identi.ca - a service based on Laconica
http://laconi.ca/Main/HomePage - the Laconica wiki with downloadable source code

27.06.08

ICANN makes the DNS flat - for the rich that is

ICANN has approved the relaxation of the top level domain rules. That would normally be a very welcome development, because it means we could finally get rid of the stupid .com/.de levels and everyone could register a .whatever. In theory that would be a great way to level the playing field and sort of reset the internet. In reality, what ICANN has done is sell the internet to corporations with a million plus to spend, because a new top level domain under these new rules costs 100,000 dollars and up (not quite clear if that is per year or a one-time fee), meaning that only the companies or persons with 100,000 to spare can register their own TLD (which still needs to be approved by ICANN). Now that means, for the first time since I have been using the internet, there is a basic service I cannot afford - one that is poised to become a "trademark" of success. The playing field has just tilted heavily toward big corporate greed - this rule change has "lobby" written all over it. After the big companies failed to make sure that their trademarks are "protected" with the TLDs as they are now, they asked to become an exclusive club of the rich: owning a very special TLD is such a strong marketing standpoint that NOT doing it would likely be branding suicide for a company - small companies, startups, people with innovation will now either burn a lot of their startup capital to get the branding benefits of a TLD, or just suck it up and use the "old ones", by which, in the not too distant future, everyone will see which company has "made it" and which hasn't - this is a really sucky development and will haunt the net for years to come. ICANN, either you make the DNS flat or you don't - the new state of affairs is on par with losing net neutrality, as it creates a two class internet society in the long run.

16.06.08

SproutCore - Objective-J - 280 Slides

Sorry to bore the ones not so into web/OSX technologies, but the news trickling in after this year's WWDC is FAR more revolutionary than any iPhone tech ever was (in my not so humble opinion, that is, of course). Today I learned about SproutCore - a Cocoa-style framework for webapplications. Well, I have been coding a webapplication for roughly six months now and I am quite deep into the matter. As a webdeveloper you have the following options:
Go the PHP/Perl/Javascript route - the most open, most supported, most ugly, most cumbersome, most DIY route there is.

Go the $insertLanguageOfChoice on Rails (or on other drugs) with a 10 MB Javascript framework that works 10% of the time route: you do it once, and in 4 years the on-Rails hype has receded, your $insertLanguageOfChoice is not supported on any webserver anymore, your Javascript framework has so many bugs and incompatibilities with the code you wrote 4 years ago, and you are forced to upgrade because the version you were using has about 1000000000 security holes.

Go the Silverlight/AIR/Flash route - very good control over how your output looks in the end, quite easy to learn and follow through, total lock-in with one or two companies that could get no worse already (you would think).

Apparently there will soon be another route.
SproutCore is a Javascript framework developed by Charles Jolley. Mr. Jolley nowadays enjoys a position at Apple working on the .Mac team. SproutCore has native drag and drop and a full Model View Controller architecture like Rails. Development with SproutCore is very much like Cocoa development on the Mac - you can build complex applications without doing much. Now tie this together with some other developments at Apple, like the gene-engineered SquirrelFish Javascript engine that by my own account is about 3000% faster than anything out there (purely subjective, me running my own stuff on Safari 4 without NDA), add a bit of WebObjects love - the backend webtech that is actually really, really awesome but was never usable due to interface problems and years of neglect (they did pick up development a couple of years ago, made it free, I even think partially open source, and included - wait for it - Javascript support) - and you are getting somewhere where you might have the best of all worlds. Let's hope that if it works out like this, it is open as hell.
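
To give a rough feel for what "Cocoa-like" means in Javascript, here is a tiny sketch of the declarative style - the class names and the binding convention are from my memory of SproutCore, so treat the details as an assumption rather than gospel:

// a Cocoa-style model class via SC.Object.extend (SproutCore's base class)
MyApp = SC.Object.create({});
MyApp.Slide = SC.Object.extend({
  title: 'untitled slide',
  notes: ''
});

// a controller and a view property kept in sync through a binding,
// instead of wiring up DOM events by hand
MyApp.slideController = SC.Object.create({
  title: 'my first slide'
});
MyApp.labelView = SC.Object.create({
  valueBinding: 'MyApp.slideController.title'  // updates flow automatically
});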

Now this is not just a rumor or some rambling - no, I have some real juice to prove how awesome something like this could be.

I accidentally stumbled over the site 280slides.com and instantly thought "wow, here something else is going on". What the site is or does is basically the Apple Keynote application on the web. I see you all shudder in despair: "WEBAPP?!?!" Yes, webapp - but at least in Safari 4 I can see no speed difference between this "webapp" and Keynote on the same computer - I would even venture to say it feels almost snappier. And the interface is as close as you can get without floating palettes and the like. Yes, you can change fonts on the fly, type stuff inside the slides, present it and share it on the web. When I originally stumbled upon it I thought "Flash - this must be done in Flash". I looked at the source code, which is an astonishing 50 lines long and does not have any Flash files in it - the most important line was the following:

 script type="text/javascript" charset="utf-8" 
src="Frameworks/Objective-J/Objective-J.js"

Whoa. Wow. Now go try the webapp - use the WebKit nightly build to get the SquirrelFish Javascript engine. It's a revelation - it's a bend in the space-time continuum - you will witness computer history - and I am not exaggerating.


Daniel Eran Dilger of RoughlyDrafted has more in-depth information on the whole matter, and since he comes from more of a programmer's side, he has the real juice if you'd like more...

Little update: I should do more research before I post - apparently 280slides is not based on SproutCore but on Objective-J, a framework by 280North - three guys, two of them former Apple employees. I still think that this might just be another name for the same thing. For the real SproutCore framework presentation, visit a .Mac picture gallery (soon MobileMe gallery) - like this one (no, that's not mine - I never put any pics on the web)... So call it what you want - Objective-J, SproutCore or a mix of it - it surely seems powerful as hell, and the apps it's churning out are fast, pretty and beautiful.

28.04.08

Here Comes Everybody: The Power of Organizing Without Organizations

Clay Shirky just said that the death of soap operas/sitcoms on TV is freeing up 2,000 Wikipedias worth of contribution time... Soap operas are of course just one front in the epic battle of distraction media vs. contribution media. I still found the original comment (boingboing.net) a bit "lush", so I looked at a talk of his, and it's a very refreshingly organized view on things that I understand deep down but couldn't communicate. Here are some quotes from the talk:

- "Social Lag"
- "Social Capability has not transformed society at anything like the rate as other applications have launched "
- "Groups are natively conservative"

- "It is curiously the moment when Technology becomes boring that the social effects become interesting"

The 4-step ladder that a participatory internet society is climbing...
"1. Sharing, 2.Conversation, 3.Collaboration, 4.Collective Action"

We seem to be past 1, 2 and 3, and apparently approaching number 4 fast. Now, with 2,000 Wikipedias worth of time freed up, are the possibilities of collective action that transforms us into a new kind of culture - a free, open source culture perhaps - imminent? This guy thinks so. Listen to his catchy, fast-worded speech at the Harvard Berkman Center for Internet and Society.

26.04.08

Semantic Web and my problem with it

Today I got the third invitation to a new "Web 3.0" application's "private beta". This one was for a service called Twine, which is basically a way to organize digital stuff - with "stuff" defined as widely as you want.
It has made me think about why I am reluctant to spend more than 20 minutes with these services, why I don't have a Flickr account, why I am not contributing to Wikipedia - and I have come to a conclusion that any of these companies should take to heart, because I sincerely believe that I might not be the only one with this thought.
In a time where time seems to be the most important resource a person can possess, I want to spend time so that it benefits my future - and with that, the future of the planet as well (because without it I wouldn't have one). I love collective wisdom (as in Wikipedia), I benefit greatly from it every day, but I have also been around the net long enough to know that spending hundreds of hours on forums, wikis etc. is a lost cause if the information you give out does not stay in your control. I don't mean that I want to control the flow of this information, nor do I want to control its death, but I want to stay in control of keeping it alive. I have been using the net in some form or another for about 13+ years now, and I have seen a lot of the information I gave to it disappear forever into some unknown electric universe. Now companies are craving to organize not just my words but my pictures, videos, PDF documents - basically my knowledge - so that others can access and find it easily - the so-called Web 3.0, or the semantic web. I applaud that thought, as it will ultimately lead to a greater collective wisdom, but I also get shivers down my spine when I think that in the future I would collect all my information on somebody else's machines. As I have just laid out, companies on the web tend to die sudden deaths, or get swallowed up by corporations with ill intent (Yahoo->Microsoft, for example, where del.icio.us hosts tons of the precious links I collected over the years), or a change of leadership makes the wrong decisions, or they have a failed backup plan and all data is lost (that just happened with a big internet provider in the US, which lost the mail of about a million accounts). It's just too much trust I would have to give - handing my most precious resource, time, to someone I don't know, someone who could morph into someone else, someone who could become evil that I don't want to be affiliated with. So I don't think the solution for me is called Twine or Powerlabs or whatever - I think the solution has to look different.
I want to have control over the longevity of my information - yet I still like the idea of a collective organization of all the knowledge in the world. So I see a decentralized structure of knowledge sharing as the better way forward. You know, all that is needed is a standard that would put all the knowledge on my servers out in the open - to be freely harvested by semantic web engines. I could then keep my internal organizational structure, use tags or hierarchies or groups or whatever to let the outside world know what these documents mean, and then the Web 3.0 companies could just take this, put it in pretty, easy-to-use interfaces and connect the information with other people. If one of those companies goes bust, you still have not only the information but also its organization, and a different service can come along and use it all. It's sort of like the OpenSocial platform that tries to do exactly this for social networking (again, I want to be in control of where the time I spend goes, without a lock-in and without the feeling that I could lose everything).
So for Web 3.0 to really kick off you need something like an OpenSemantics framework that can be implemented in just about any information collecting tool out there (blogs, wikis, even forums). Something that helps you tag, organize, auto-organize your information and makes it available to others out there. It has to be an open source standard that is extensible to a lot of information carrier formats, and it probably has to be pushed by something like the W3C to give it enough traction to gain any footing against these thousands of new "web 3.0" startups - a sketch of what such markup could carry follows below.
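
To make the idea a bit more concrete: the W3C's existing RDFa syntax could be one carrier for this. The snippet below is a purely hypothetical sketch of self-hosted, harvestable metadata (vocabulary here is Dublin Core; the attribute layout follows RDFa as I understand it) - not an actual "OpenSemantics" format:

<!-- hypothetical: a page on my own server carrying machine-readable
     tags that any semantic web engine could harvest -->
<div xmlns:dc="http://purl.org/dc/elements/1.1/">
  <h2 property="dc:title">Semantic Web and my problem with it</h2>
  <span property="dc:subject">semantic web, data ownership, open standards</span>
</div>
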
Until then I refuse to give out more time to any of these things no matter how many "private beta" invites they send me.

7.03.08

Apple says "Flash Video is Performance Killer"

OMG. It's such a relief that I am not the only person on the planet who thinks that Flash video is the most stupid invention since - well - proprietary internet plug-ins in general. Apple's Steve Jobs said in his press conference yesterday:

" the technology doesn't meet his company's performance standards for video."

That is great to hear and I fully agree. I cannot see how a codec(1) in a container format(2) in an interactive container(3) in a plug-in(4) in a browser(5) on an operating system(6) is any good idea (that's 6 levels of abstraction! Six levels of security problems, six levels of performance problems - and that for something that is already the second most performance-hungry thing a consumer can do). And it seems that Apple agrees on the point - now the only thing Apple needs to do is bring QuickTime into the 21st century, because after all, QuickTime still sucks (though not as badly as it used to) when it comes to creating in-browser motion webpages with custom controls. But at least Apple is going the Javascript route with their integration, and therefore the open standards route - as they do with open MPEG-4 containers (which can be played by anything, not just QuickTime). It feels very good to be in such company with my thoughts on Flash video, and to have made the right choices in the past for the future :)


7.02.08

Apple Quicktime: documentation for DOM events

Well, I have been eyeing the use of QuickTime together with Javascript for quite some time now, but was very afraid to actually shun the built-in controller in favor of a custom designed one. This is not a problem anymore. Apple has as of today (coincidence that I went over there?) published the DOM events for attaching event listeners to the QuickTime plugin, making it possible without too much hackery to take over control of the movie. It seems Apple fears the Flash competition more than ever when it comes to online video distribution, and this step - adding event listeners (and documenting them) - will earn them a bit of web credibility again - especially since you can now control your embedded QuickTime object better than a Flash movie through Javascript (standard, nonproprietary scripting language etc. etc.). Together with the CSS3 animation properties (soon to be standard, nonproprietary) introduced in the latest build of Safari, and the superb SVG support (standard, nonproprietary) in Safari, Apple is poised to be a real threat to Flash again (and by now it must be clear that I do not like Flash in any flavor).
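
A quick sketch of what this enables - attaching a standard DOM listener to the embedded movie. The element ids are made up, and the "qt_"-prefixed event names are how I read Apple's documentation, so double-check them there:

// listen for QuickTime's documented DOM events on the embed/object element
var movie = document.getElementById('myMovie');  // id invented for this example
movie.addEventListener('qt_play', function () {
  // update the custom controller once playback actually starts
  document.getElementById('playButton').className = 'playing';  // id invented
}, false);
movie.addEventListener('qt_ended', function () {
  document.getElementById('playButton').className = 'stopped';
}, false);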

Get the Quicktime DOM Events documentation here.

5.02.08

Session Variables without cookies

I do not like code from untrusted sources residing on my harddrive, so I dislike cookies in all their forms and flavors. Yet if you want to do clever javascripting for your website, you need to hold data in memory that carries across pages and does not necessarily need to be stored on the server. A clever hack uses window.name to store up to 2 MB of data - heck, Safari and Firefox can even hold up to 64 MB in that field (after which they crash - security problem? Opera has a 2 MB limit). The window.name value is kept as long as you have the same window open, so it survives page loads - just what you want if you need, for example, transitions of colors from one page to another. No word on whether Internet Exploder 6 supports it (I guess it should, but it probably crashes after 100 KB and you can only save M$-approved data ;)
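
The mechanics are almost embarrassingly simple - a minimal sketch (the stored string and its use are invented for illustration): stash what you need in window.name before the user navigates, and read it back on the next page.

// page A: stash a string in window.name before navigating away
window.name = 'bgcolor=#ff6600';  // survives the page load in the same window

// page B (same window): read it back and clear it
var carried = window.name;
window.name = '';  // don't hand the data to the next site in this window
if (carried.indexOf('bgcolor=') === 0) {
  document.body.style.backgroundColor = carried.split('=')[1];
}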

read more about this cookie-killing hack here: Session variables without cookies


4.02.08

Microsoft's Yahoo takeover bid - Flickr users are revolting

As predicted in my first article on the subject, Flickr users are less than happy about a Microsoft takeover and have started the microsoft-keep-your-evil-grubby-hands-off-our-flickr pool.
But the more I think about it, the more I see that this takeover is extremely vital to Microsoft and that it will go through no matter what, because the offered money is too much for Yahoo to reject in good faith. It's not only about the advertising money coming from search engines, and not only about the very vocal frontline user groups of Flickr, del.icio.us and Yahoo Groups - it's also about access to the dynamic web services behind those groups, a way to push nonstandard Microsoft "standards" down internet users' throats and - and I think this is a much bigger deal than anything else - a way to push Silverlight into the internet limelight and take on Adobe and Google in the same go. That's something Microsoft would be just too happy to accomplish.
Why? Because Flickr was the defining moment in internet history that moved Flash out of the "don't make any more stupid moving interfaces that waste the user's time" corner and into "Flash is probably the only viable means of making interactive webapplications with multimedia content". It's why Macromedia got bought by Adobe just a year later, and it's why YouTube came into existence in the first place. Now, Microsoft has been developing this "Flash killer" that nobody uses anywhere on the web - if Flickr were to use Silverlight, that would give Silverlight a proper boost and put it on the agenda of webdevelopers, and Microsoft would not have sunk yet another project geared toward the web - or so the management thinking probably goes. That this is all faulty thinking, and that the only thing that would happen if Microsoft started pushing Silverlight onto Flickr is that Flickr would soon die - well, that's not stopping Microsoft from trying - watch it.
I actually HOPE this happens, as it would definitely reopen the race for animated vector graphics on the web and put SVG in place as a viable nonproprietary option much sooner than if it just had to compete with Flash on its own (no media attention, no adoption). As said, this is going to be the grand internet soap opera for the weeks, months and years to come - get a tea, sit back, relax. It's putting fire under Google, it's putting fire under Adobe, it's putting fire under Microsoft (trying to make something out of it), and the web has a chance to overcome old things (like Flickr itself) and create something new and hopefully more open, more nonproprietary and better, leaner, meaner in the meantime.

The four cable internet outage and Iran

When I heard about the two major internet backbone cables cut off the coast of Egypt, I was already a bit on alert. It had never happened before that two cables were severed in one accident. Since I have no clue how close together the cables are laid on the ocean floor, I thought it could have been an accident. When there was reporting of a third cable cut off the coast of France, I believed nothing anymore. It has never happened before that three major internet backbone cables were cut at the same time by accident - let alone by the same type of accident (a ship's anchor drags through the cables and cuts them in half). When there were reports that a fourth backbone had been taken off the grid and the affected region was again similar (the Middle East), I thought to myself that there is something more at play here. First reports had the fourth cable also cut by an anchor - in the Persian Gulf, off the coast of Qatar.
Then yesterday, reports with outage numbers came in. And while all the mainstream media reported that India had a 40% network loss and Egypt 50%, not a single news organisation reported that the only country with a 100% network loss is Iran. That's right: after the fourth cut cable, Iran is now an internet island with no outside connection to the world until the cables are fixed.
Now today come reports that the Qatar cable is actually intact and that a power outage caused the trouble there - and isn't Qatar the command center for the US of A's army operations in the Middle East?
What about the cables off Egypt? Well, the radar records of the region at the time of the cutting show not a single ship in the area, and the Egyptian authorities say that the area is a nautical no-go area - mainly because of its important infrastructure. So no ship equals no anchors equals something cut them that was undersea and not visible to radar.

This all comes at the same time the US of A is conducting a cyber attack exercise that is also geared against all evil bloggers trying to tell the truth.
So now I find it strange that it was exactly the four cables connecting Iran with the rest of the world that had accidents. Don't you?

1.02.08

Microsoft wants Yahoo - what you should know

It's official - after about 3 years in the rumor mill, Microsoft made public today that they wanna swallow Yahoo for a whopping 44.6 billion USD - and as of today it looks like Yahoo will probably go along. That's a lot of cash, but what Macrosoft does not yet see is the user exodus this will likely cause. For me, the rumor first came up 3 years ago, almost exactly on the day I wanted to start using Flickr. The rumor back then was enough to reconsider putting my time and energy into that service - and my intellectual property, because I read the fine print for Flickr and I did not like it (using all pictures for their own purposes whenever they want, wherever they want - I don't know if that has changed over the years). Now Macrosoft will have the rights that Yahoo had, and therefore the rights to Flickr. I don't know if there are any similar services out there, but I am sure that if there is even one that offers similar functionality to Flickr, users will start swarming away.

But it ain't stopping there. del.icio.us is also part of Yahoo. Now, I have extensive amounts of links on del.icio.us, but during my research for the podcast I stumbled across Ma.gnolia.com, and guess what? It imports your del.icio.us links fully intact, looks prettier, has more functions, is easier to use and has more ways to sort - it's just A LOT better than del.icio.us. I switched in a heartbeat, and it was perfect timing, it seems - no one knows how long you will be allowed to export your del.icio.us bookmarks once M$ ownz it all - and most likely reads it all as well, and stores it, and judges it - any pornlinks, warezlinks and similar darknet links to be removed very soon so they are clean and MSN-search compliant.

Then there is Yahoo Groups - one of the biggest mailinglist services out there. Again, it's one of the businesses Yahoo bought up over the years (and merged with their own unsuccessful version). My old university runs its official mailinglist on there (something I initiated way, way back when nobody even knew what a mailinglist was) - well, since mailinglists are not used that much anymore this is not a big loss, but a lot of communities might still want to reconsider whether they want to open up their communication to Microsoft, who likes to work with governments.

Oh, and then there is Yahoo Mail - if you have an email address on there, expect it to end in "msn.com" soon - no, that's not gonna happen, but then again, how would you feel if Microsoft had direct access to all your mail? Googlemail just looks sooo much more attractive.

I think M$ partially wants the groups that Yahoo has created - lots of groups with diverse interests, groups that make art, collect links and just talk about diverse things. They also want the 10% search market share that Yahoo has left (has anyone ever really used Yahoo search? I have, and it sucked). In general I think Microsoft does itself, Yahoo and the internet no favor if this goes ahead (it's still an incredible amount of money, even for Microsoft), but since when does M$ think ahead? Generally, it's a grand soap opera unfolding in front of the internet users' eyes, and I am pretty sure that 10 years down the road it might end in big crocodile tears for Microsoft....

A bit more on the topic on Saturday in the radioPrototypen podcast (it will be in German though)...

23.01.08

Internet Exploder 8 -> We have THREE non-standards-compliant modes now

Microsoft seems to be on the path to figuring out how to scare away the last two content developers they still have. Instead of going the IE7 route of a standards-compliant mode and a quirks mode, they add a standards-compliant quirks mode that has to be enabled on every website before IE8 renders the ACID test (or any standards-compliant site, as it should) correctly, as reported earlier. Not only was it already not so pleasant to learn and understand how to make IE7 render pages that IE6 would not - without using the IE6 hacks, which didn't really work in the first place - no, Microsoft "invents" another totally new metatag and expects everyone to go over their pages and include it. Really, what the freak are they smoking? Break the fucking backwards compatibility and finally make everyone just abandon IE6 - it's a freaking dead horse that was born in the internet dark ages - and don't drag the stupid non-working code along - I am totally, utterly sick of it. I hope this will further alienate developers from IE, to the point where IE has MUCH less market share than the other browsers, which are actually standards-compliant from the get-go and don't need no freaking quirks mode to operate properly. The metatag is even designed to distinguish .0x releases of all browsers - so that designers can create pages for every .0x release of every fucking browser out there, or what?

Read more about the "new" metatag at A List Apart


9.01.08

Internet Graffiti: W3C.org spamhacked?

This is starting to become an ongoing series, I guess - since the space.com hack two weeks ago, I have come across another major hacked site. This time it's much more subtle, BUT it's also a MUCH bigger fish.
I am talking about the CSS3 specification blog entry on the official W3C.org - one of the most important sites on the net, hosting all the interweb standards documents.
The thing that makes this hack so scary is that it is NOT a standard "defacement" like the space.com hack, where everyone immediately knows what is going on, but a subtle "I paste two lines of text and a link inside the normal document". If the guy had done this in English instead of Spanish, I might not even have noticed it. It's mostly harmless "poker spam", BUT there is someone out there who can change text on the official W3C.org site - which means this person can also change specifications. Every webdesigner, browser programmer and anybody else who wants to make something standards-conform on the web visits W3C.org, and just a small error on one of the pages can replicate and threaten to kill a lot of man-hours of work around the world. I find it pretty serious - let's see how long it takes this story to float to the top.

the link to the affected article is here:

http://www.w3.org/Style/CSS/Planet/


Update: I just got the following reply:

Thanks for the heads up. It appears W3C wasn't hacked, but CSS3.info
was. I've forwarded your message to the CSS3.info guys; if they don't
fix it soon, I'll temporarily suspend them from the Planet feed.

~fantasai

5.01.08

With this kind of attitude Firefox is on the path to failure

I stumbled upon one of the most annoying bugs ever in Firefox and had to wade through a list of comments just to find out that there is no bugfix for Firefox 1 and Firefox 2, but there might be one coming in FF3, whenever that is released. This is a serious interface bug that makes forms almost unusable: the cursor (caret) just never appears when you have a certain kind of relatively/absolutely positioned layout with overlapping divs. For my scenario, none of the suggested workarounds work. As said in an earlier post, I am not recoding things just to make them work in certain browsers if the code is valid and verified. What really pisses me off is the arrogance of the Firefox team toward this specific bug - which I see as deep-red serious - a bug that was first reported almost two years ago and has been neglected ever since. Read this comment from a FF developer to see what I mean. If you have people posting over and over that a bug is a BIG problem for A LOT OF PEOPLE, the last thing you want to do is piss them off more - especially if your browser has stiff competition, the bug is obvious, it has not been fixed for more than two years, and you are not even planning to backport the fix to an earlier version of your browser.


Ryan VanderMeulen 2007-11-06 08:32:43 PST
What about comment #100 is unclear? The fix is on the trunk and relies heavily
on other changes which were made on the trunk as well in order to work.
Backporting it to Fx2 isn't going to happen because it would require way too
much developer energy and be extremely regression prone. Plain and simple, it's
not worth the resources when Fx3 is already nearing its first beta release.

I know you guys aren't happy with the situation, but that's the way it is. The
fix is in what will eventually become Fx3.

In the future, please follow the etiquette guidelines before deciding to spam
bugs with yet another "me too" comment and forcing everybody who's CCed to it
to read your thoughts on the matter. I assure you that most of us on the CC
list really don't want to hear them.
https://bugzilla.mozilla.org/page.cgi?id=etiquette.html

It IS freaking worth the resources, as this is what keeps webdevelopers happy and recommending your browser - this is no small bug, and the error lies clearly in not treating it as a serious bug when it is one, as the countless comments saying how important it is show. Pointing me to an etiquette page just adds insult to injury. I will leave the site as it is and don't care if it shows up correctly - meanwhile I am not recommending Firefox anymore. Use Safari on Windows and Mac, or Konqueror on Linux, to see a webpage the way it is supposed to look.

23.12.07

Space.com hacked by Body

Just stumbled across the space.com website and found it hacked. Funny guys, those Arabian script-kiddie 1337 h4x0rz.

20.12.07

Internet Explorer 8 allegedly passes ACID2 test.

It seems bitching in a blog's comments has for once achieved something - or so the official Internet Exploder blog claims. The upcoming IE8 allegedly passes the famous ACID2 test, which checks browser compatibility regarding CSS 2.1. That comes about four days after a lot of people did a lot of Microsoft bashing over them announcing the browser name but no other information. That bashing was about the loudest Microsoft bashing I have heard since Longhorn was first delayed five years ago. Some peeps - including me - have written that they are abandoning each and every browser hack out there whenever possible and just adhering to webstandards as written on the W3C standards pages and checked with the CSS and HTML validators. Yesterday they responded with a "we are listening" post and a picture allegedly showing an IE8 browser window with a functioning ACID2 smiley in it.
My first reaction: "Yeah, I'll believe it when I see it."
My second reaction: "Did all the bashing for once really achieve one of the greatest gifts to the internet since the birth of the browser?"

IE8 is supposed to go into public beta next spring.

That means no feeling guilty about building standards-conform websites, because in 3 years they will just work in any major browser :) and if not, then Microsoft loses the last bit of credibility it might have.

Oh, and Microsoft's ACID2 pass comes about 4 years after Safari's and 1 year after Firefox's - just for the record.

Microsoft Blog Entry about IE8 passing the ACID test

Working ACID2 test to test your browser

A couple of comments on the blog over there

re: Internet Explorer 8 and Acid2: A Milestone
Wednesday, December 19, 2007 4:01 PM by Victoria
Awesome, so you'll finally be standards complient when IE8 launches in 2012?

Great news.
--------------

re: Internet Explorer 8 and Acid2: A Milestone
Wednesday, December 19, 2007 4:13 PM by Robbo
passing a test case is one thing - working flawlessly in the wild is another very different thing ...

---------------


re: Internet Explorer 8 and Acid2: A Milestone
Wednesday, December 19, 2007 4:23 PM by Chris
Finally.

I'm a Mac-only shop but had to purchase Parallels and copies of Windows XP and Vista to check our compliant websites in IE6 and IE7. It's pretty ridiculous and I'd love to send a bill to Microsoft, but I have a feeling they won't be paying it.
I have more costs in my business because of Microsoft, particularly because I demand all of our products are 100% div-based, compliant, and accessible. Quality is important to us and Microsoft should have a vested interest in making our lives easier, for the betterment of the greater Internet.

I'm happy IE8 is finally taking steps toward this. The browser is years behind.

---------------


re: Internet Explorer 8 and Acid2: A Milestone
Wednesday, December 19, 2007 6:41 PM by Aleksey V lazar
I see a smiley, but I'd like a running build to test this and other things by myself. I can make this smiley image in two minutes :)

Acid2, is by no means a complete measure of CSS standards support and CSS is not the only standard. In fact, many IE issues are not CSS-related.

I honestly think that this whole affair with IEBlog and talking with developer community, etc., is a bunch of public relations B.S. The fact that they are touting the Acid 2 test is just proof of this. Whereas other browsers have been continuously improving their standards support and then at a certain point passed this test too, these jokers haven't done a thing worth mentioning -- and then, out of the blue, look, our vaporware IE8 passes Acid 2 and we have a whole PNG to prove it!

Really, one must be naive to buy into this sort of PR/marketing nonsense and to think anyone at Microsoft (at least decisions makers) gives a damn about web developers or web standards. What they care about (by definition) is market share and share price. None of this would occur if it wasn't for other browsers chipping away at browser market.

13.12.07

Dropping full support for Internet Explorer - NOW

and any non-standards-compliant bullshit.

I am fed up. I am a single-person webdeveloper who just wants to present some work that is normally not even netcentric. The hours I have spent making everything I program work in every browser can be counted in the thousands by now (and I am truly not exaggerating). All I will do from now on is write W3C-specified code. I do not care if it doesn't look good in every browser out there - as long as it works in Lynx (a pure text browser), gets through the CSS and HTML validators 100%, and looks good in my reference browser - which is WebKit (Safari). My browser statistics show a huge market drop for Internet Explorer, to the point where Firefox has taken over the lead (30:50, the rest being Safari and all others under 1%). But it's not only Microsoft - Firefox will suffer too. The only browser that consistently shows the result I want when I program to strict W3C guidelines is Safari - there is no "but", and everyone trying this out should notice it. Firefox with its stupid moz- tags can go to hell too. I program once, and that is for a standards-compliant web, and I would urge every developer out there who is able to do the same. Make the browser vendors make THEIR product work with us, not us work with their products. There is enough competition out there now that the market should decide, and once half the webpages look strange in a certain browser, its market share will drop like a lead ball into a cream cake.
Seeing that Microsoft is probably 2 years off from Internet Exploder 8 (the major announcement was that it is called IE8 - not that it is finally standards-compliant), and that Firefox just completely fucked up a layout I created over two days through an update from 2.0.0.6 to 2.0.0.11 (a textarea not accepting em input like it should), I am sticking with Safari, as it has shown the most consistent standards conformance over the years, is the only browser that passes the ACID test, and is the only browser that can perform almost any standard Javascript function the way it's meant to.
I am not saying that my websites won't work in Internet Explorer (or Firefox) - they certainly will - but they won't look pixel-perfect there.

Oh, and I forgot to mention Opera - I hate this 1%-market-share browser with its own 100% quirks, and I will not even test any future site in it. If you make a 3rd-party browser, make damn sure it's 100% standards-compliant, because except for geeks, no one will do you the favour anymore.

I hear of more and more webdevelopers being fed up with the non-standards-compliant bullshit that is out there, and I hope this turns into a tide that makes all browser makers listen and make damn sure their product is W3C standards-compliant inside out. This is much more important than supplying 100+ plugins, interface skin hooks or activeNIX controls. Get the basics done first and add the nice stuff later - WebKit is the only browser platform that seems to adhere to that rule. Together with the new debug tools that come with Safari, it is also the nicest platform to develop for (sorry FireBug, you just lost an advocate, as the new nightly builds of WebKit let you edit your source code right inside the browser!).

The great thing about going standards-compliant is that once the browsers get standards-compliant, your webpages magically start looking better ;)

Movable Type is free

As noted before, Movable Type has been put under the GPL (GNU General Public License), which means it's open source and free as in speech, and the bare-bones version is also free as in beer. We have used Movable Type from the beginning and just couldn't - wouldn't want to - migrate to a different platform, as it always seemed too much hassle. We also coughed up the small fee for a multisite blog (that was the point where we almost switched). This is all behind us now, and it seems we have bet on a long-distance horse with our blogging software. Now, I think a company that develops software and whose business model is not to sell the software but to sell distributions and support will be the business model for all software sooner rather than later (quote me in 10 years). Because with a truly open source approach you have tons of helping hands in your code, making it better, more stable, faster and more secure, and saturating it with plugged-in features at no cost. You as a company are still the one who knows the software best (you coded it from the start), so you are probably best placed to help big corporations install it - meanwhile the free nature of your software allows anyone to get used to it and train themselves on it, generating a legion of enthusiasts who in return will advocate your software over closed source alternatives, giving your company a sustainable income - everyone is happy. This approach is also, in my opinion, much better than the leaderless community approach of free software (for example WordPress), with no real direction and too many side roads, leading to stagnation or confusion among users (Drupal is also one of those beasts, but that's about to change too).
As said, I think sooner or later most of the software industry will come around to this business model. I am very happy that I don't need to think about migrating the two active blogs.

12.12.07

Ogg removed from HTML5 spec

While I am all for open standards and great free technology, there is one piece of the "open source" world which I do not support and never will: the OGG container format and its accompanying video codec, Theora. The open sourcers are pushing this standard so vocally that you would think their lives depend on it. I have never understood this passion, because there are obvious quality shortcomings in the standard, and it's also not as open as it seems - a company made them, couldn't make money off them (because they are just not up to snuff compared with modern codecs/containers), released them to the public for free and opened the source code - but kept the patents, just in case.
So the only things that set OGG apart are that there are no royalties to pay for encoding and that you can have a look at the source code. That OGG was slated to be included as the recommended audio/video standard in HTML5 is news to me, but I am so much happier that this bid has been dropped (under pressure from Nokia and Apple, mostly).
Now, I couldn't care less if this were not one of the areas vitally important to me. Video quality is really important, and Theora falls short in every test I did compared to MP4/H.264. It's bigger, has less temporal quality, has less interframe quality, and is generally more blurry. This is why it failed as a closed source codec, and there is no reason to use it just because it's open when it is actually a step backward. So I think the decision to leave the video format war open for the time being is a good one.
Now can they please put a Flash video ban in the HTML5 spec - because even with the inclusion of H.264, Flash is still a plugin in a plugin in a browser playing a resource-hungry video - a total waste of processor cycles and something that locks out 50% of the computers out there (even a Core 2 Duo with a dedicated graphics card chokes on some of the newer high-def Flash videos on the net - I don't even want to know what a 2-year-old computer does).
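
For the record, this is how the codec-neutral video tag is meant to work - the browser walks the source list and plays the first type it can decode (the file names are invented for illustration):

<!-- no codec is mandated by the spec: the browser picks the first
     source whose type it supports -->
<video controls width="640" height="360">
  <source src="trailer.mp4" type="video/mp4">
  <source src="trailer.ogv" type="video/ogg">
  Fallback text or an alternative player could go here.
</video>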

"Ogg's video codec is Theora, which was proprietary. On2 developed it as its closed competition to MPEG-4's H.263 (DivX) and H.264 (AVC) codecs, alongside other competing proprietary codecs from Real and Microsoft (WMV). The winner to shake out of all that competition has been the MPEG-4 standard, which includes both a container and different sets of codecs. MPEG-4 is open and supported by lots of companies, and is also supported by FOSS (x264 is among the best implementations)." - DECS


http://yro.slashdot.org/article.pl?sid=07/12/11/1339251

The official discussion

1.11.07

SVG, the sorry state of vectors on the Web

Everyone who knows me knows I have been a Flash hater since day one. I saw Flash as a dying species when it first came out, as the arcane nonstandard scripting language (why not use Javascript or Python?) never appealed to me, and neither did the "plug in to the web" approach instead of an "open standard, open code, open everything" approach. I also did not like Macromedia's interface mentality (and a recent look at Fireworks CS3 has given me those flashbacks), so I never felt good working in Director or Flash or any other Macromedia app - except for FreeHand (which I used before it was a Macromedia product - Aldus, anyone?).
But I do like vectors, and I especially like fonts - especially crazy nonstandard Fraktur fonts. So the ability to include fonts in Flash files and make a layout that plays exactly as the author intended on all computers is an appealing one. My hatred for Flash has trumped the need for alternate fonts and true vectors to this day, and likely will long into the future.
Then, a couple of years ago (1999), Adobe Systems proposed a standard to the W3C - and so did Microsoft - to break Flash's superiority on the web with an open standard. Both proposals were fused into one and became SVG. SVG sounded good back then and still does - a truly open, documented XML document format, with truly open source in its generated files, backed by the W3C - and easy to generate with just a text editor if you are so inclined. I was dancing on my chair back then, if I remember correctly.

There were two major problems - no easy way to generate content (other than a text editor) and no browser compatibility.

Adobe themselves tried to eliminate the first problem by putting out the ill-fated program "flame" or whatever it was called. It sucked - it sucked hard - and it wasn't for the Mac. Then there were years and years of silence, until an obscure "export to SVG" menu item appeared in Illustrator CS1 (I think).

On the browser side it didn't look better. Adobe did offer an SVG plugin - but that eliminated one of SVG's biggest superiorities over Flash. Neither could the first incarnation of Firefox play SVGs, nor did Safari do a good job with them (though it did at least recognize them in some sort or form, without throwing horrendous errors all over the place) - Internetz Exploder I have no idea about, but my guess is that it does not even know what an SVG is to this date without the Adobe plugin.

Then Adobe bought Macromedia, and one of the biggest reasons for the takeover was Flash. Adobe must have seen that not being able to deliver platform/client-independent interactive vector graphics with embedded fonts would sooner or later put them at a big disadvantage, and that making such a technology from scratch is not as easy as it sounds - especially with the browser developers trying to get the basic W3C standards to work first, before implementing something flashy like SVG.

As my heart had jumped at the introduction of SVG, my heart plummeted when I heard about the Adobe takeover. I thought that was it, and Flash would take on the world without anyone stopping it. Flickr and PooTube only aggravated these fears - I was waiting for Adobe to try to make Flash a W3C standard or something along those lines - yet something else and surprising happened all of a sudden. Firefox and Safari started to add built-in SVG support - unusable at first, but progress nevertheless. Illustrator's SVG export also became more and more sophisticated and - CLEAN.

Then we entered the post-AppleIntel, Adobe CS3, Microsoft Silverlight (why they did not choose SVG I cannot figure out) world. A world where Safari has full SVG 1.1 support, Firefox claims 1.0 support, and Adobe Illustrator CS3 outputs clean, sufficient SVGs.

That's the day I jumped in to see the true power of it unfold before my eyes. I stumbled into it accidentally. All I wanted to do was make a form that I use very often in InDesign more "interactive". The form in question is a template to fold a DVD cover out of a single piece of paper, without glue, with printing on the front and back side. It's one of these great origami secrets :)
Well, that form has one big problem - all text and graphics are printed at a roughly 35.3-degree angle.
The first obvious choice was to make it a PDF form. After trying it out with Acrobat 8 and whatnot of small utilities, I gave up - you can only rotate form fields in a PDF by 90, 180 or 270 degrees - no 35.3.
OK, I was about to give up, but I really want this form that I use so often to have an easy interface, without opening InDesign every time I burn a DVD.
Next I was thinking about a database solution - FileMaker in particular. Obviously I tried to angle a field there right at the beginning, just to find out that you cannot do that there either.
I did not have a solution in my head anymore. CSS? Hmmm, no. CSS3 is supposed to have a rotation value, but the support is rather sketchy even if you are developing for just one browser (which would be sufficient for this form, as only I want to use it and I use Safari all day, all night).
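
(Just to show what that rejected CSS route would have looked like - a minimal sketch using the vendor-prefixed property from the WebKit nightlies of the time; the class name is invented:)

/* rotate a block by the odd folding angle - WebKit-only at the time */
.fold-label {
  -webkit-transform: rotate(-35.3deg);
}
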
I tried to get the idea out of my head - unsuccessfully. Then I remembered SVG, briefly looked into the documentation for Illustrator CS3, found lots of good things about it, and thought: that's worth a try.
I laid out the piece and exported it through the "Save for Web and Devices" dialog. The first thing I saw made me smile: "Include Font - for used glyphs only - or for all glyphs, or none, or commonly used glyphs". YEY, fonts!!!!
Then I looked at the code - XHTML :) I felt right at home from the beginning.
But how do I make it interactive? I decided to go a PHP/Javascript route.
That is when it became apparent that even if everything looks rosy, it isn't. There are virtually no how-tos on SVG on the net. The W3C documentation is so geeky that it's over my head, and the few how-tos available were either extremely old and didn't work, or just didn't work. I delved into it.
Four long days later I figured it out - scrapped the Javascript route and went a pure PHP route - after I found out how to integrate PHP into SVGs, and after I figured out how to manipulate the DOM tree of an SVG with Javascript that sits outside the embedded SVG. In short: it was a pain!
The coordinate system in SVG is not easily understood, and I had to print out 40 pages to get the resulting SVG to print correctly (only in Safari 3 final, I might add - it does not work in any other browser), because Safari scales stuff so it fits on a page - which was counterintuitive with this document. The code is intermingled, and inside-to-outside Javascript integration is not robust in Safari. DOM updates are not as straightforward as in plain vanilla XHTML. Deleting parts of the DOM tree and reintroducing them also only worked after extensive fiddling (the main reason I went with pure PHP).
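
(To give you an idea of the outside-to-inside dance: a minimal sketch of scripting an embedded SVG from the host page - the element ids and the text are invented, and this assumes the browser actually exposes the embedded document:)

// grab the embedded SVG's own document from the host page
var host = document.getElementById('dvdform');  // the object/embed element, id invented
var svgDoc = host.getSVGDocument ? host.getSVGDocument() : host.contentDocument;

// then manipulate its DOM tree like any other XML document
var label = svgDoc.getElementById('titleText');  // id invented
label.firstChild.nodeValue = 'My New DVD Title';
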
But the biggest, biggest problem was the handling of text - how can any standard that displays text in any sort or form NOT support a line break! Yes, you heard right - for multiline text you can NOT use anything of the < br > sort - you have to calculate the line spacing yourself and make new lines through < tspan > elements with a "y" value that specifies where your next line of text sits! HOW INSANE!
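
(This is what the workaround looks like in the markup - every "line break" is a tspan with a hand-calculated y offset; the coordinates and text are arbitrary examples:)

<!-- two "lines" of text: the second tspan restarts at the same x and is
     pushed down by a hand-calculated line height (20 units here) -->
<text font-size="16">
  <tspan x="10" y="20">First line of the label</tspan>
  <tspan x="10" y="40">Second line, spaced by hand</tspan>
</text>
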
In the end, though, I am impressed by the output, because I could include a font in the document, and I now have a nicely working form to print my DVD labels. And changing laid-out text on the fly through a webinterface is something novel and great.

I surely hope SVG makes progress now that it is at last almost usable - but I do not expect it to gain widespread adoption until the line break issue is resolved - it's a HUGE showstopper.

One big question that came to my mind was a copyright issue. Fonts are to design what RIAA songs are to the ear - a copyright regime fighting hard against its users since day one.
So if I use my own licensed copy of Solex and embed it into an SVG, do I break the license?

Why? - you might ask - Flash has done this for years. Yes, - I reply - but with Flash you do not have an easy option to extract the font from the files. In an SVG, every vector of the font is readable in plain text - you can even move vectors around if you want to - it's just a matter of time until someone writes an easy-to-use "font translator" plugin that can read the SVG and generate a TrueType font out of it. Fontshops galore will want to have a say in this, I am sure!

Overall, the future for open vectors with embedded fonts on the web does look brighter after this year - and death to Flash and all :P

Next project regarding SVG for me will be to try to embed video in it.

24.03.07

Web Development in 2007

Or: Does Internet Explorer suck intentionally?

I have just completed a small webproject — nothing life-changing, nothing fancy, nothing ridiculously cool, and generally a very, very small project — perfect for fiddling around with some technology that I have wanted to employ for a long time. Me — not having done semi-serious web development since I redesigned this blog — was curious about a number of questions, especially "What is wrong with the Internet Explorer and why is it wrong?" — but we get to that way, way later (the most interesting part, but the rest leads to it). First, some basic fact checking:


"Whats the backwards compatibility path used these days?"

Browsers change, get recoded, disappear, reappear and fight with each other — that's generally the situation each and every webdeveloper on the planet faces with their first website. There is no way to support every browser on the planet in every version ever released — period. Everyone who looks at Internet Explorer 4 Mac Edition and makes a website work on it the same way it works on a Firefox 2.0 installation gets a lifetime "you waste your time" award.
Generally I approach this question with a very particular bias. First and foremost I want to push technology to the edge, because — as stated here before — only used technology gets pushed forward, redeveloped, reused and generally made more useful.
But I am getting sidetracked. So, being in early 2007, some way into a fresh, awesome millennium full of (sometimes) exciting technological advancements: how far does a normal "mom and pop" website, one that needs to reach a broad general audience across the full spectrum of age and technological knowledge, need to go with webbrowser compatibility? Since this website intends to sell you something, it needs to work — easy, clean, simple and perfect. Now, if we look at browser statistics these days (I don't believe any of them, but then I generally don't believe statistics that I haven't come up with myself, so that point is moot), the field is WIDE open. The general consensus is that there is a total of four browser engines (for computer web browsers, that is — more on that in a minute) on the market worth looking at:

1. Internet Exploder
2. Mozilla/FireFox/Gecko
3. Safari/KHTML/Konqueror
4. Opera

For me personally, and for this specific project, one browser falls out of consideration right away. I am really sorry, my open source "we make everything different than the guy next door just so we are different and are perceived as cool" friends — Opera is a nonevent for me, and I would venture to guess for about 99% of webdevelopers. Yes, according to statistics Opera has a whopping 1.5% market share. I have met only two persons in my whole life who use Opera, and if those two persons (totally unrelated to each other) give a picture of the Opera-using population, then it's safe to say they are technology-savvy enough to figure out that when a website doesn't run it might be the browser, and I am sure they have 10 other obscure browsers on their machine to access the site. That goes out to the Opera team too — if you want your browser to be used, make it 100% CSS/HTML/XML standards-compliant and 100% Javascript/DOM compliant, because webdevelopers have a life, and really, there are enough problems to fix without looking at every obscure "me too" browser on the market. I really do love and try to support small software rebels, but my prior experience with Opera was so BAD (in development terms) that I am absolutely sure ditching it will not cause any ripples in the space-time continuum, and it gives me at least 10% more time outside web development to rant here.
With this out of the way, you, dear reader, might say: "Hey, but what about Safari/KHTML? It's similarly obscure and has a similarly small market share." Yes, dear reader, at first glance it might seem so, but from personal experience I can name about 100 people (compared to Opera's two!) using Safari daily, because it comes with a pretty widely used operating system called MacOSX, and as it is with these bundled apps — some people think they have no choice but to use them. The great thing about Safari — besides being bundled and forced down mom's and pop's throats — and totally unlike Opera (never used 8, but used 7 and 6): it's about the most standards-conform browser on the planet — yes, even better than Firefox. It's even recommended as a standard reference platform (wherever I read that — if I find it, I'll post a link). So even with a tiny market share — which I personally think is really at least five times as much as in the "statistic" — the developers of KHTML/Konqueror, together with Apple's enhancements, did something Opera has utterly failed at: eliminating the need to specifically test for this platform. When you adhere to the standards set by the W3C, you can be 98% sure that it runs, looks and works beautifully. That's in Javascript/DOM, CSS, XML and XHTML.
Another great thing about it is that it is automatically updated with the system (the Safari variant — Konqueror users are probably also very up to date, as Linux users are in general), so you can be sure that most people using Safari are on one of the latest versions. So testing is constrained to 2.0 and onward.

Moving up the percentage ladder, we approach the burning fox. While I was not very impressed by the early siblings (first Camino, for example), Firefox is now a stable, mostly standards-conform platform, and with the FireBug plug-in it has become my main webdevelopment choice (this might change with OSX 10.5, but I can't say anything — NDA). So it's clear that my developments will work out of the box with Firefox and hopefully all Gecko-based browsers. So how many versions back do you need to test? I don't test anything before 1.0, because people who used Firefox before 1.0 can be assumed to be intelligent, fast technology adopters who probably have the latest version installed. Actually, I am not even testing on versions later than 1.1 at the moment, because I think the majority will only upgrade at X.0 releases, and those hopefully didn't break things that were working (and you cannot under any circumstances be responsible for any and all nightly builds there ever are of any open source browser anyway).

With those out of the way, we get to the point where things start to get ugly. This point is the internet venom called Internet Explorer — or, as it is nicknamed by about every serious web developer: Internet Exploder. In case you have not heard of it — it's the browser that Microsoft "pushed" out the door in the mid 90s to combat Netscape's dominance of the early internet. It's the browser that started the browser wars, killed off Netscape (temporarily) and has since earned Microsoft a couple of antitrust lawsuits and A LOT OF HATRED among web developers of all kinds. The problem is: Microsoft won that browser war (temporarily), and the antitrust lawsuits have done nothing to stop the bundling of that browser with the most used operating system in the world — namely Windows. So with about 60% browser market share as of last month (if you want to believe the linked statistics), it has more than double that of Firefox and just can't be ignored, no matter how much you swear. Now, all this would only be half as bad if those 60% weren't quite unevenly distributed between the three main versions 5.0, 6.0 and 7.0. And looking at the individual percentages, each has more than double the share of Safari, so you'd better support them all. Heck, I would even throw in 5.0 Mac Edition for the fun of it, because I have personally witnessed people STILL using that! Now, a person not experienced in webdesign might say: "Hey, it's all one browser, and if you don't add any extraordinarily advanced functions, 7.0 should display a page like 5.0 and everything should work great."
Well, without going any further down a well-worn path, I can only say this: it fucking doesn't. If you need to support people using Internet Explorer, you need to go back to at least version 5, including the Mac Edition.
Now, if Microsoft had tried to support web standards as they are kinda set in stone by the W3C, this would all be only half a problem. But Microsoft has chosen to go down their own path and alter little things in the W3C spec — the best-known being the box model difference in CSS.
(I am going to get into that in a second — I just need to find a way to round up this section.)
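
To make that difference concrete - a minimal sketch (the class name is invented). Under the W3C model the declared width is the content width, so padding and border are added on top; the old IE model squeezes them inside:

/* W3C box model: rendered width = 100 + 2*10 + 2*5 = 130px
   old IE box model: rendered width = 100px (content squeezed to 70px) */
div.box {
  width: 100px;
  padding: 10px;
  border: 5px solid black;
}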

What I haven't touched yet — due to a clear lack of experience — are phone and other device browsers (gaming consoles). For this specific project this was no issue, as I think the number of people using a phone to order a highly specialized documentary DVD is close enough to 0. Gaming consoles are definitely not the target group of this DVD either. For up-and-coming sites out of this office I will clearly look into the "outside of the computer" browsers and will surely post my findings here — generally I think they are all going to move to an open source engine like Gecko/KHTML sooner or later (the iPhone will drive that; Nokia has already decided to use KHTML, for example, and the Wii uses Opera — I tried browsing on the Wii and it sucks, bad, compared to all the other stuff that's beautiful on that machine).

To round this up: if you want to reach the mom and pop majority of the web, I concluded you have to test on Internet Explorer back to version 5 (including the Mac Edition), Firefox 1.0 and upwards, and Safari 2.0 and upwards. You may also want to check your site with a browser that does neither JavaScript nor pictures nor anything in that regard, to make sure your site is accessible to the blind and readable by machines (Google robots, e.g.).

Now with that out of the way the next question formed in my head:


What content management system should I use?

While this specific project has little in the way of content updates, it still has some (adding new shops that sell the DVD to the reseller list, e.g.) and, more importantly, it had to be deployed bilingually. Both of these considerations prompted me to go with a CMS. Now I don't know if anyone has looked at the CMS market lately - I have done some intense research (also for a different project) and it's quite a mess. There are basically two types, blogging-centric and classic CMS-centric, and a lot of in-between breeds.
Since I don't want to bore you too much: most of the open source CMSs can be tested out at the great site opensourcecms.com.
Personally the only CMS I have used (and am still using for various reasons, but basically I really do like it) is the blogging-centric Movable Type (not open source and costing some dough). But Movable Type is so blogging-centric that doing anything else with it is counterproductive (though it can be done). So me - freshly in the CMS game, knowing that "blogging-centric" is not what I want here - looking at all the options, I found it very hard to decide on one by pure looking. The user comments on opensourcecms.com are already very helpful in sifting out all the ones in pre-beta development status. Left over are basically the big CMSs: Typo3, Mambo, Drupal and Plone, each with their own good and bad sides. The one I really, really liked from a pure technology, feature and stability standpoint was Plone, but Plone depends on Zope, and for Zope you need a very heavy-duty server that runs just that - which I don't have. The learning curve for Typo3 seemed much too high for me - thanks, I am already used to PHP/Perl/XHTML/JavaScript/CSS etc. and have no need to learn another obscure description language on top of that just to use a CMS.
This left me with Mambo and Drupal as the likely choices. Mambo's architecture seems dated and is at the moment in a state of flux and recoding - I do not like unstable software and have no need to beta test more warez than I am already doing - so Mambo was ruled out. Drupal came out as the winner in the CMS field at the moment - but NOT SO FAST, my friend. I installed it and used it. It has a flashy Web 2.0 interface with lots of useful functions. Well, there are SOOO many functions that would never be needed for this project. It is also very heavy on the server in the default install (and I had no urge to get into the 200-page discussion on optimization techniques for Drupal in their forums). It became clear that this CMS is not the solution; the only function I was really looking forward to was including QuickTimes and MP4s in an easy way. It turned out that including these is easy - having them show up the way I like, and not in the Drupal developers' vision of "another tab in a tab in a window in a frame", proved extremely difficult.
This left me with either going with a fast hardcoded website that I would need to maintain for the next 5 years, or digging up a CMS that I had used before and almost forgotten about - x-siter.
This CMS is written by fellow CCCler Björn Barnekow and is totally unsupported in any way other than "ask him and he might reply". The beauty of it is that it is ULTRA lightweight - he himself describes it as the smallest (code-wise) CMS out there. It is pure PHP, and even if you have no clue about PHP it's very, very easy to understand what it does. From the perspective of the end user who has to maintain the site, the approach is unrivaled and beautiful, because you just edit paragraphs of text, add pictures etc. on the page they belong to. So no fancy meta-storage system for all the pages, no folders containing the pages - you edit right inside the individual page. This is a huge advantage if the person you need to teach how to update the site is the general secretary or such, because she browses to a page she wants to change, logs in, edits it and logs out - it's very, very close to WYSIWYG editing and very easy to explain to everyone.
The layout possibilities with x-siter are also well thought out, giving you an adjustable typographic raster that you can put pictures or text etc. into. A very nice approach. So why isn't everyone with a small website using x-siter, and why has nobody heard of it? Well, first of all it's more an internal tool for Björn that he shares without any official support and without much documentation inside the code either. He thinks you might not need to touch much code, and generally he is right; sadly, design realities are different. I have a concept in my head of how a website needs to look, and if the CMS tries to make me do it differently, I would rather adjust the code of the CMS than adjust the look in my head. And this is where x-siter shows a big weakness: the code is not very modular and not very well commented, so I had to change quite a few things, and now the code cannot easily be upgraded. But generally, if you need a very fast small site that needs to be updated now and then, x-siter is definitely worth looking into. Even one Max-Planck-Institute switched from Plone to x-siter because it's so much faster, actually scales quite nicely and has good user account management. It does lack some of Drupal's advanced things, but generally I do not miss those features (and most can be added through other means anyway).

So I employed x-siter and heavily modified it to get the certain look I wanted (taking some headlines out of tables and into CSS divs etc.). Since the site is pretty simple, I needed an excuse to implement at least one advanced, cool, dynamic function in there.


What cool "new" "web 2.0" technologies are really cool and worth testing out and are generelly usefull for this small website?

Well, I couldn't find much. First I thought I'd rewrite the whole CMS to make it fully dynamic (without any full page reloads - imagine that), but thank god I did not go down that route. There was one function of the page, though, that definitely needed some design consideration. It's a function that on return kinda breaks the whole look and feel of the app by generating an ugly error/confirm HTML page.
That function is a simple form mailer to order the DVD. Now this function - needless to say - is also the most important function on the whole page.
So my thinking went down the route of: "Hey, I want the return page of the formmail.cgi to replace just the div of the form. If there is an error I should be able to go back to the form and correct it (without having to fill it out completely again)."
Great, that's a simple AJAX request to the server and putting the returned HTML into the DOM of the current page, with the option to return to a saved state of the DOM. YIPPEEE, some coding fun - or so I thought.
Generally, implementing this with FireBug and SubEthaEdit was dead easy - almost scary easy (look at the code at the bottom). Here is how I did it:

First I replace the normal form button with an ordinary "javascript:function()" link button, inserted by a script below the normal form button. That ensures that people without JavaScript can still order the DVD the normal non-AJAX/ugly/unintuitive way of doing things from the last millennium. Those people get a normal submit button, while people with JavaScript enabled get the AJAX button, because those people should also be able to use the AJAX functions.
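A minimal sketch of that swap - the ids and function names here are my illustration, not the site's actual code:

    // Runs only when JavaScript is available; without it the plain
    // submit button stays in place and the form posts the classic way.
    function installAjaxButton() {
        var plainButton = document.getElementById("plain-submit"); // hypothetical id
        var ajaxLink = document.createElement("a");
        ajaxLink.id = "ajax-submit";
        ajaxLink.href = "javascript:sendOrderForm();"; // the AJAX submit routine
        ajaxLink.appendChild(document.createTextNode("Order the DVD"));
        // the non-JS fallback disappears, the AJAX link takes its place
        plainButton.parentNode.replaceChild(ajaxLink, plainButton);
    }
    window.onload = installAjaxButton;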
So the user fills out the form and hits the AJAXified "submit" button. The form data is then sent over a standard asynchronous connection through a standard XMLHttpRequest. At this point you already add browser-specific code - this part has been figured out by others, but the code you add JUST for Internet Exploder is already three times as long as the code would be if that browser functioned normally.
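For reference, the usual cross-browser constructor of that era looks roughly like this - one line for the standards browsers, an ActiveX ladder just for IE:

    function createRequest() {
        if (window.XMLHttpRequest) {
            return new XMLHttpRequest();                       // Firefox, Safari, Opera
        }
        try {
            return new ActiveXObject("Msxml2.XMLHTTP");        // newer IE
        } catch (e) {
            try {
                return new ActiveXObject("Microsoft.XMLHTTP"); // older IE
            } catch (e2) {
                return null; // no AJAX available - keep the plain form
            }
        }
    }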
Anyway, the data is then processed by the normal nms formmailer.cgi, which returns a (sadly non-XML) HTML page. I then parse this HTML and check whether the result is an error, a server error or an "ok", and then let specified output for each case drop into the DOM of the webpage in the only correct way (which is NOT innerHTML!).
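Dropping content in "the correct way" looks roughly like this - a sketch with made-up ids and messages: you build real nodes and swap them in instead of assigning an HTML string:

    function showResult(messageText) {
        var box = document.getElementById("order-form-box"); // the form's div
        var p = document.createElement("p");
        p.appendChild(document.createTextNode(messageText));
        while (box.firstChild) {            // empty the div node by node
            box.removeChild(box.firstChild);
        }
        box.appendChild(p);                 // drop in the new content
    }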
Before I exchange the content of the web form with the result page, I save the form with all its content in a DOM-compatible way using cloneNode (it's just one line of code, I thought!). So if the result is an "ok" I purge the stored data and tell the user the order is being processed, along with this and that data he has sent. If there is an error, the result page contains a JavaScript link that, when clicked, exchanges the result page with the form and all its content.
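The save/restore part, again sketched with hypothetical ids; cloneNode(true) copies the node with all its children, which in Firefox and Safari includes what the user has typed in:

    var savedForm = null;

    function stashForm() {
        savedForm = document.getElementById("order-form").cloneNode(true); // deep clone
    }

    function restoreForm() {
        var box = document.getElementById("order-form-box");
        while (box.firstChild) {        // throw out the result page
            box.removeChild(box.firstChild);
        }
        box.appendChild(savedForm);     // the user's input comes back with the clone
    }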
So far so good. Coding this part, including learning the whole technology behind it, took me three hours.

So the website looked great, the functions worked as expected and, since I am hyper-aware of the CSS box model issues of Internet Exploder, it even looked great in Internet Explorer 5 on the Mac. At that point - about 20 hours of total work (including digging through the PHP of x-siter!) - I considered the site done.
BAD idea.


Problems with Internet Explorer 5.0, 6.0, 7.0

The first thing I noticed was that IE 5.0 Mac Edition does not support XMLHttpRequest AT ALL, and it does not do any DOM either. That made me aware that a very, very few users might have a browser that a) has JavaScript but b) does not support any of the modern AJAX and DOM functions.
Well, that was easily fixed by checking, in the very first script called (the one that replaces the submit button), whether an XMLHttpRequest can be established. If not, the user gets the ugly normal non-JavaScript version - if yes, the browser should be capable of getting through the whole process.
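So the first script now gates the button swap on a capability check, roughly like this (a sketch refining the swap shown further up, not the literal production code):

    function ajaxCapable() {
        // IE 5 Mac has JavaScript but neither flavour of XMLHttpRequest
        return !!(window.XMLHttpRequest || window.ActiveXObject);
    }

    window.onload = function () {
        if (ajaxCapable()) {
            installAjaxButton(); // the button swap shown further up
        }
        // otherwise: leave the plain submit button alone
    };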
Again a sigh of relief from my side, and 5 minutes of coding later I closed up shop for the day. The next day would just be some "routine" IE 6.0 and 7.0 testing on the Windows side and then the page would be done - or so I thought. I was very happy because I had very portable, future-proof code that was small, lightweight and easy to understand. There wasn't even a bad hack in there, except for the XMLHttpRequest check.
Opening the page on the Windows PC with Internet Explorer 7.0, my heart dropped to the ground like a stone made out of lead in a vacuum on a planet the size of our sun.
Nothing worked. The layout was fucked (I am quite used to the CSS box model problem, so I had avoided that in the first place!) and the submit thing did not work at all - clicking on the button didn't do shit.
The CSS was fixed after fiddling with it for two hours; there seems to be a "bug" in IE 7 with classes vs. ids and z-depths. Fixing the JavaScript was much harder. I couldn't use FireBug to get to the problem - because in Firefox the problem didn't appear. The IE 7 debug tools are as crude as ever (a JavaScript error console, which did not produce any errors).
So I placed strategic "alert"s in the code to see how far it would get and what the problem was. It turned out the first problem is that IE cannot add an "onclick" event listener to something I changed inside the DOM after the DOM was drawn (add 3 hours to the time total). I struggled for a solution and rummaged the web for clues. It seems that IE up to version 7 (the current one!) cannot do "setAttribute" as the W3C says; instead you have to set every attribute through (object.attribute = "value in stupid string format";) - so for a link it's (object.href = "http://your link";) instead of just putting it all in through object.setAttribute("attribute", "value");
Now if you think you can also add an event listener this way by doing (object.onclick = "function";), forget about it. Extensive testing (2+ hours) showed that there is currently absolutely no way to add an event listener through normal means to an object that was created after the webpage was built - in Internet Explorer, that is - in Firefox and Safari this works wonderfully. So my solution was to use the "fake" onclick through an href="javascript:function();" - thank god that this exists, otherwise I would have been forced to either write 200 lines of code for custom event notifiers (which I am not sure would have worked) or abandon the already working approach altogether.
If this still sounds too easy to count as a serious deterrent - this was not all it took. Creating an object and setting its href attribute to the "javascript:" link before writing it into the DOM also does not seem to work in Internet Explorer. I had to actually write it into the DOM, pull the object by its ID again and then change the href attribute. This doubled the code for this portion of the JavaScript.
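Put together, the IE-safe way to build that link looked roughly like this (ids again my illustration); either the setAttribute call or an assignment before insertion would have been enough in Firefox and Safari:

    var box = document.getElementById("order-form-box");

    // 1. create the link WITHOUT the href - IE would lose it anyway
    var link = document.createElement("a");
    link.id = "back-link";
    link.appendChild(document.createTextNode("back to the form"));

    // 2. write it into the document first
    box.appendChild(link);

    // 3. NOW fetch it again and set the href as a plain property;
    //    link.setAttribute("href", ...) and link.onclick = ... both fail in IE
    document.getElementById("back-link").href = "javascript:restoreForm();";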
Now the problems were far from over. As you might remember from earlier, I save the part of the DOM containing the form with its content, so I can get back to it in case there is an error in the form. This worked great in my two standards-conformant browsers. So I am at the point where every important browser can get the form mailed out, and if you have an error you get the "back" button, which through the object.href="javascript:" way I could also make work in IE. Now I was sweating that the whole cloneNode business might not work in IE and I would have to parse the whole form and rewrite it, but clicking on the now functioning button did in fact work as advertised (by me) and I returned to the form. But the trouble was far from over, because now my "submit" button didn't work again. Imagine that - I am cloning the DOM, as in CLONING: I take each object with all its attributes and values and CLONE it into memory. When I put this CLONE back in, it should be just like the original - well, it is in Firefox and Safari. Internet Explorer seems to choose which kinds of attributes and values it clones and which not, because the values of the fields had been cloned and correctly copied back, yet the href attribute was apparently not clone-worthy and completely forgotten. At that point I had been sitting a full day debugging IE's JavaScript/DOM implementation. So on the third day I just did the dirty "grab the object again and change the href manually" hack and let it be.
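That final hack, in the same sketch vocabulary as above - restore the clone, then re-grab the submit link and hand it back the href that IE's cloneNode dropped:

    function restoreFormIE() {
        restoreForm(); // put the saved clone back (see further up)
        // the field values survived the clone, the href did not - set it again
        document.getElementById("ajax-submit").href = "javascript:sendOrderForm();";
    }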
In general I recorded 27.4 hours of development/design for the whole page, including PHP scripting (of which I have no clue), and 13.6 hours of IE CSS/JavaScript debugging (of which I have a very thorough understanding). My code bloated by a third just for the IE hacks and is less readable and less manageable for the future. And it definitely runs slower - not that you'd notice in this small app, but extrapolating that to a huge web app like Google's spreadsheets, I think the speed penalty is mighty.


Why is Internet Explorer so bad?

Throughout this project (and many before it) I have asked myself this question. I am no programmer and have no degree in computer science, yet I think it can't be THAT hard to make a good standards-compliant browser. Microsoft had about seven years (yes, SEVEN YEARS at LEAST) to make Internet Explorer standards-compliant, listen to the complaints of the millions of web developers and redo the whole thing from scratch - in the meantime Firefox has made it to version 2.0, Safari will soon be at version 3.0, and even a small shop can churn out a browser at version 8 - even if it's not perfect, Opera is still MUCH better than IE ever was.
Now, Microsoft is swimming in money and has thousands of coders in house who are all probably a million times smarter than me when it comes to programming. The bugs that are present are enormously obvious and waste millions of hours of web development/design time. The box model problem should have been a matter of adjusting some variables, and the JavaScript engine - well, after a rewrite with native DOM support (at the moment it's a hack at best, I would say) all the problems should be gone.
Now, Microsoft had the chance to fix this with Internet Explorer 7, and while transparent PNG support (a hack) and finally a fix of the box model problem (also not 100% fixed, so I heard) have been touted as breakthroughs in supercomputing or something, the whole DOM model Microsoft uses does not work (and they admit that on the IE dev blog - promising to look into JavaScript/DOM for 8.0 - in 10 years). And that at a time when the whole web wants to use the DOM to make rich web applications with great, consistent interfaces. I have looked into some of the AJAX make-my-webpage-move-and-look-all-funky frameworks and I can tell ya -> they have the same problems as me, and sometimes more than half the code in them exists to get around IE limitations - which slows them down to hell, I guess.
So IE 7 is almost a non-event - and I am asking, now even louder: WHY COULDN'T MICROSOFT FIX THOSE THINGS?

First, my personal take on this: they have no clue. This multibillion dollar company has no idea why someone would want a consistent interface on a website that doesn't reload just because a number changes somewhere down in the text of the webpage. The reason I think that: look at Vista. Vista is flashy, has an inconsistent interface (I'll just say: 10 functions to shut your computer down!) and uses every processor cycle available on your CPU and GPU just to run itself (so much so that not even cheap modern laptops can run the full version flawlessly and fast). So if they don't realize that this is important for themselves, why would they realize that these are important concerns for developers outside the Microsoft bubble?
Now, pretending that a multibillion dollar company is too dumb to make such assumptions is probably as dumb as thinking that the Bush administration has no advanced hidden plan behind what they are doing (well, you know, as with Microsoft - they could be just plainly dumb, or have some greater goal that nobody fully understands, or the ones who understand it don't have a loud enough voice to make it public).
So, since we are not dumb over here, we stumble upon ideas of why this all is the way it is. The best take is by Mr. Joel on Software in his blog entry called "API Wars". Joel is a software developer writing bug tracking software for Microsoft's operating systems. He is very well known and very respected in the developer industry, and his sometimes controversial statements cause quite a stir now and then, but he - being very much inside the OS and probably able to read Microsoft assembly code backwards - is most of the time spot on. In the linked article he talks about how Microsoft has always been about protecting their main treasure - their APIs, with the Office API and its closed document structure the crown jewel above everything else. He also talks about how they have so many APIs that developers are getting confused, and since most of the APIs are for sale, developers nowadays turn away from the APIs Microsoft provides and turn TO THE WEB to develop especially small applications - the reason most shops are on Windows is believed to be exactly those simple small applications only available on Windows + Office.
Now I said "the web" — the web has the notority to run on any OS somehow and manages to circumvent Microsofts monopoly in a nasty uncontrollable way — poor Microsoft.
Now you probably understand what Joel and I are getting at - yes, Microsoft does anything to stop the web from spreading too fast and getting too useful before they have found a way to completely control it - and they try so hard: DRM, Trusted Computing etc. are all designed to control web content - good thing that web apps are not content and Microsoft is acting too slowly. When you read through Joel's entry you get a quite clear understanding that Microsoft is not interested at all in making web development easy, and the longer their JavaScript does not work and their interactive DOM model only works in some strange emulated way (change something in the DOM and look at the HTML source - you will not see your change reflected there), the longer they keep their monopoly - and that this monopoly is threatened by web apps is apparent from Google's spreadsheet and word processing apps - sadly for Microsoft, these already run even on a broken IE 7 (poor Google guys who had to debug that).
I do see that Microsoft will try to "fix" these problems - because a) this is turning into a publicity backlash that not even Microsoft can hold out against (the box model problem circulated for years but then gained so much steam that Microsoft HAD to release a press release stating that they were working on an IE 7 that addresses some of those things - that was three years ago or so), and b) so many developers/admins/tech addicts etc. suggested to friends that using Firefox is more secure (thank god IE has had so many open doors for viruses) that Firefox usage exploded and is now eating slowly but steadily into the IE market share.

Now Microsoft faces a very big problem - a problem so satisfying for web developers that I would sell my kidney to see it unfold (maybe not, but I would hold a party in case of that event). Microsoft understood in the mid 90s that with a large enough market share they could force any competing browser out of the market by introducing proprietary standards (ActiveX, their own implementation of CSS and many more), because web developers go with the browser with the biggest market share. That worked quite well and played out like they intended - almost. Web developers are a strange bunch of people, and some use Linux and other evil stuff, so they made sure that big companies tried to stay cross-browser compatible (I am one of those who wrote about 100 emails to big companies telling them their website doesn't work on browser xy, and the millions of others doing that are the reason we do not have ONLY Internet Explorer today - Microsoft really almost completely won - it was close).

Now back to the point and to the happy future outlook. If Internet Explorer's market share dropped below - let's say - 10%, I would be the first person to drop Internet Explorer support completely and rather debug small things for Opera in addition to the two nicely working Firefox and Safari browsers. My thinking is that the hatred for Internet Explorer among web designers/developers has grown to such proportions that I would NOT be the only one - heck, I would even say this browser would fade from visibility in less than a year. This would satisfy so many people who have lost so much time of their lives just because some stupidly rich multibillion dollar company wanted to get richer on their backs. This is not unfolding right this minute, but I can see that if they wait too long with IE 8 and fixing the aforementioned JavaScript/DOM problems, this might go faster than everyone thinks. Web time is a hundredfold acceleration of normal time, and a monstrous creature like Microsoft might not be able to adjust fast enough - one can really hope.

6.06.06

JumpCut: Web 2.0 meets iMovie

Rising to prominence among home video editors through its Scanner Darkly remix contest, which has been reported all over the place, JumpCut is gaining popularity about as fast as YouTube did just 3 months ago. Now this in itself wouldn't be worth reporting here, but there are some ramifications along with it that make it a much more special case compared to the relatively straightforward YouTube.
First of all it gives a new meaning to the "remixing is active consumption" metaphor, as it basically lets you remix each and every movie on the system inside the browser, without further software and with just a few mouse clicks, all in an "easier is not possible" iMovie-style interface. So armchair cutters can now remix the wedding video of one guy with a porn scene and publish it on their site (I would think there might be rules, as it's a business and not a wild free internet website, but it's worth a try anyway). This puts YouTube, with its closed you-can-never-upload-it-to-your-iPod-or-do-anything-else-with-it watermarked blurry re-re-recompressed video concept, in the backseat. Not only are the videos not watermarked, they seem to be raw converted, judging by the quality of some (upload raw uncompressed and it's compressed only once) - hey, and you can download them, and without any investment in the expensive hardware that delivers you the only software amateurs seem to be able to use sufficiently these days (read the title) you can roll your own or destroy others'. Very cool indeed.
But what is even more important about this - and forget about all the video sharing for a sec - is that it's a fully functioning editing program (well, almost) inside the browser. Well, well - that is what everyone wants: making software that runs without OSes, in the browser, on some distant server. Seeing that happen so fast surprises me a bit and makes me utterly scared - think of renting software by the hour and similar things that become possible with that kind of technology. The software industry wants that - that has been stated more than once - it would put the death knell into software piracy without any way to escape and would make sure that all those monopolies are nicely guarded against competition. Of course this is all a bit off yet - but that video editing is one of the first implementations of this strategy, even before word processing or spreadsheets (oh no, Google is supposed to be making a spreadsheet, I just read somewhere today), is a bit worrying, as it demands much more data throughput and processing power. Interesting times to come.

Link to JumpCut

PS: Oh no, they have the same green as the Lifeform - no, I didn't know that before....

3.06.06

The artificial lifeform has a new heart

Well, well. It had to be done at one point; I just thought I could put it off until I am finished with school and have a bit more time at hand. I upgraded the prototypen.com blog software to Movable Type 3.2. As said before, I was looking into migrating to a non-proprietary solution, but that would have cost me even more time, and all the other solutions out there are just not so good when you have a multi-blog/multi-author environment - and I just want this to work without having to adjust things all the time, because if I hate one thing it's web design and programming. Since the older Blacklist version had ceased to function, the only viable solution was to upgrade. I cannot describe how much pain it still was. I don't know what the people at Six Apart smoke all day, but changing EACH AND EVERY CSS HOOK made "moving the templates to the new version" just not possible - the comments wouldn't work, or this and that would not work - you might even think they did it on purpose to sell more "service contracts" that offer painless upgrades. Not only did they change every CSS class name, the new names make utterly no sense whatsoever - calling the side navigation "beta" is the most stupid thing I have ever heard; you always think it's something that is in beta and has not been fully implemented. Logical descriptions readable by other people seem not to be in the knowledge base of the Movable Type originators. To make things worse, all my old templates lost their formatting, leaving me with a garbled mess of code (and I do style my code so I can change it later on). And then there is something called "clever commenting" that makes your code easier for other people to read - especially when you absolutely know that other people need to understand your code for the program you sell for lots of money to work. Needless to say, there was only rudimentary commenting in place - and most of it seemed left in for the programmer himself, as the codes and shortcuts were utterly incomprehensible.
So for all of you out there looking for a blogging solution: Six Apart's Movable Type is NOT IT. I am stuck with it and I hate it, and throwing money into their mouth is really not the thing I wanted to do in the first place but could not circumvent - if I can help that they are utterly destroyed, tell me how and I am on the front line.
The reason I had to upgrade - the spam plugins - while looking shiny, new and nice at first view, do nothing different from the Blacklist plugin, except for a few functions that are tied to the proprietary "TypeKey" registration, which is also a Six Apart project. I am utterly against any "proprietary authentication" entity, so I chose not to enable this - well, I have more spam than ever. Not that it's filtering through to the blog, because I am moderating almost all messages now, but it's sitting there waiting to be looked at and handpicked by me, and I thought those times would be over with the upgrade... Six Apart, if you read this: you have a very, very, very unhappy customer. Not only was I forced to pay for an upgrade from a formerly free version of your software, your software sucks at all things that are not basic CMS, especially for people wanting to alter it and people with a spam problem - may a fast crash of the economy be your grave.

One positive thing is of course that I had to touch the design side of the blog, which had not seen a design upgrade since day one. So while I am not fully happy - I especially hate fixed-width websites - the new design is a bit more to my liking, and I have a blog logo now, something I always thought was missing. Originally it was to be the picture on this entry (the artwork is from the cover of the user manual of the KC85/III, the East German "home" computer I started off with when venturing into digital territory at nine years old), but I decided on a simpler solution, and the ASCII face representing the artificial life is better, I think. As for the fixed width: I will probably get into that at some point when I have more time - at the moment it has to be like this, even though it goes against all things dynamic web layouts should inherit - blame Six Apart, not me.

7.02.06

FON vs. Freifunk - the p2p revolution goes network hardware

I was recently ranting about the Google empire and showed off a p2p web search engine that has since been gaining steam - YaCy. Right on its heels comes the p2p-ification of the internet in general, with an announcement by FON - a Spanish startup - that they have secured venture capital from evil Google and Skype to make a p2p network happen with existing hotspots in people's homes. You sign up with FON, get their $25 router or flash your Linksys router beauty with a new firmware, and then share your internet connection with the outside world - for free to anyone who also offers a FON-based hotspot, and for pay for everyone else wanting to use your network connection. The catch is that willing ISPs get part of the cake (income) if they participate in the FON service. A very sound concept with a good business strategy behind it - it makes people money if they offer the FON service and encourages FON use among other users - and the backing by Google and Skype will help with mass adoption.
Yet in the light of this there is a shortage of reporting about other hotspot endeavors. The most prominent one in Germany is freifunk.net - "let us create mesh networks that are free throughout the country" is the motto - taking the ISP out of the equation completely by creating a worldwide mesh network, free to use by anyone and everyone, interconnected to the roots. But no financial backing, only volunteer service, an overdesigned website without any "map that shows me where Freifunk is available" and no press department spreading the message to the masses will ultimately lead to the failure of this and every other free wireless mesh network - as sad as it is: business world, you only survive if you play by the rules. Still, in Berlin there is a recognizable Freifunk scene, and in some prominent spaces, mostly in the east, you have been able to get free net with Freifunk for a while - though I found the service lacking speed and reliability every time I tried it. Generally this is the project to support - but I give FON the upper hand in the world domination of hotspots, if they survive the technical difficulties of the startup phase.

2.02.06

Google between God and Death

An interesting article on CNNMoney talks about the future of the search engine emperor we are all addicted to when it comes to expanding our minds with more or less useful information from THE net. They interview Ray Kurzweil and a couple of other researchers, futurists and bankers to foresee the future of Google. And it goes like this: either it becomes the evil media mogul, or it becomes the emperor of the internet, or it becomes God once we are ready to upload our souls into the GoogleCollectiveBrain, or it simply dies under its own weight. I would say, after seeing the recent discussions on how they act in China and how they would bow down to any government other than their own, the active netizens are already becoming aware of the Google gatekeeper's power, and only one small solution somewhere that rivals the search engine - which is still the core of Google - would take them out immediately, leaving shareholders running away as fast as possible and leading to a total crash. My bet here is on the distributed search engines like YaCy taking the Google crown - especially in the fascist American reality.

8.12.05

Google will eat itself

An art group from Switzerland(?) is doing a very "fun" thing to demonstrate problems in the online space vs. capitalism. The concept is to automatically buy Google shares whenever someone clicks on a Google ad on one of their hidden pages throughout the net. With that they think they can own the whole of Google sometime in the future and give the online data giant - with a market value higher than all of Switzerland's banks together - back to the users (Google to the People Public Company Ltd.). If successful, it would deconstruct the global online marketing mechanism and the system of capitalism as a whole. I wish them lots of clicks - Google back to the base.

Google will eat itself.

11.11.05

What the Web 2.0 really is....

Ah, the buzzwords are swinging into gear again with the emergence of the "Web 2.0" metaphor, otherwise known as the .org bubble. The highly cheeky Register has asked its readers what they think of Web 2.0, and the answers are extremely fitting:

Web 2.0 is made of ... 600 million unwanted opinions in realtime
Web 2.0 is made entirely of pretentious self serving morons.
Web 2.0 is the air for the next bubble
Web 2.0 is made of ... marketing and collaborative self-deception
Web 2.0 is made of ... Segway spare parts
Web 2.0 is made of ...ideas without Foundation

Well, you get the idea, and there are many more over there. I highly agree with most of them - says one of the 600 million unwanted opinion contributors. Untag.

19.10.05

What(s) the font?

Sometimes you more or less accidentally stumble over a website you wish you had known about before. One of those came under my fingers today. Looking for a font, I could vaguely remember some known font sites: Linotype.com or MyFonts.com. The latter popped up during the dot-com era and I wasn't even sure it still existed. All the more surprised I was when I had a look at their vast catalog and the easiness of buying fonts from them - if I chose to do so. The most amazing feature of all revealed itself under the "WhatTheFont" button. It prompts you for a picture on your HD or a picture URL on the web, and then, with a little help, it fiddles out the type for you and presents you with a list of typefaces that could possibly match the one you gave it. I tested it with a very bad pic from the web and it accurately presented me with a couple of typefaces that looked extremely close to the input (so not exact, but that might be because the type was custom made). This has probably been around for quite some time and I just never found it when I most needed it, but it's a very good link for those designers with clients who don't even know their own in-house font...

13.10.05

Microsoft asks Webdesigners for help

No, I am absolutely not kidding. After years and years of ignoring the standards set by the W3C, the worldwide officially acknowledged internet standard-setting body, Microsoft is now figuring it had better follow the standards with its upcoming Internet Exploder 7. Now you may think "that is noble" - the irony of the story is that this will break compatibility with the hacks that web designers employed over billions of hours just to make their websites work with Internet Exploder 6 and smaller - and every web designer in the universe can tell you of the pain and lost hours caused by the past incompatibilities; some have completely lost their nerve for developing anything for the web. These old hacks that build on errors in the Internet Explorer code are now haunting Microsoft as it switches to standards conformity. It seems that Internet Explorer 7 does not display any CSS with hacks in place correctly - box model, fixed positioning etc. Microsoft's solution? Ask the web designers to correct their code and write yet another set of code just for Internet Explorer 7. They MUST BE FUCKING KIDDING. Web designers have developed a deep, enduring hatred for anything coming out of Microsoft in connection with the web, and I predict one of the following two reactions to that "call". First, there will be the type of web designer who does not see the Windows-XP-and-up-only Internet Explorer 7 catching enough market share to reward changing the code - in effect leaving all the hacks as they are and breaking IE 7 compatibility. Second, there will be the purist web designers who take out all the hacks they can as fast as possible, as they see this as the way out of the years-long standards hassle, and put an "IE7 and other standards-conformant browsers only" sticker on the site. This will haunt Microsoft in the years to come and surely eat into their market share as more and more consumers find themselves confronted with websites that just don't work with IE, or look butt ugly or strange or whatever. Someone on the heise.de forums posted the Goethe classic "The ghosts that I called", and I can only say: the master might not come in time...
Personally I never got around to using any of the hacks - all my code is 100% standards-conformant, and it was and is always possible to do about any design this way; it just took more time. Now I am happy not to need to change any code on any of the sites I created...

Read the Microsoft cry for help here.

5.07.05

The future of the web is not the browser

Making tools myself that take information in the form of RSS feeds and XML streams etc. and put it into new clothes, I get more and more the feeling that the future of the web does not sit inside the browser window. As more and more information becomes available, this information craves the attention of the world population, and to distinguish itself from other information the browser seems not the best tool. I am not talking about flishy-flushy Flash sites; I am talking about proper use of this information, presenting it in a manner that makes it easier and faster to understand and comprehend. It's tools like widgets, screensavers, blog TVs, tickers and tools that have yet to be invented that will take the web to a version 2.0, where information is not only plentiful and pretty to look at but is also presented in a form that makes it easier to follow the information trail that is of interest to the viewer. For now, take a look at my screensaver for the incom workspace tool of the University of Applied Sciences Potsdam to see a concept of what I mean. Pictures and user-definable RSS streams are mixed to give an outsider an overview of the school and its web presence, the newcomer a way to find use for the tool, and the insider a nice way to follow what is going on while taking a break. I am not saying it's anything new; I just come to the conclusion that this is where part of the web is headed, and I really like this.

PS: The screensaver is programmed in Apple's Quartz Composer and runs only under Mac OS X 10.4.1 Tiger, with a graphics card of at least 16 MB. I would like to thank Pierre-Olivier Latour, the Quartz Composer architect, for his help throughout this little project.

21.05.05

Distributed BitTorrent Tracking

Via hackaday comes info that the Java BitTorrent client Azureus now has the ability to use a distributed tracker model. Now all the admins of BitTorrent trackers can take a deep breath and just offer "link collections of some hash values of some files floating around in some distributed tracker client network that is non-observable and non-defeatable by the Music and Film Associations of worldmerica."
The distributed tracker model is an implementation of a Kademlia distributed hash table overlay network. Another step ahead into unregulated internet use.

12.04.05

Tagging - what and why

The question of how useful tagging really is came up during a class that is part of my communication design studies. Since tagging is a very hot topic today, I asked a community about tagging that is as hot as tagging itself: the video bloggers. I know the people who program ANT are on that list and are thinking about the tagging problem at the moment. Here are some answers to my sometimes naive questions:

Continue reading "Tagging - what and why" »