
19.01.09

The Oamaru Declaration of Independence

Via Citizen Renaissance comes this highly thoughtful text: a redo of Jefferson's Declaration of Independence by Dr. Susan Krumdieck of the New Zealand Transition Towns Network. It tackles everything that is wrong in a few short paragraphs - too bad it's just words again.

When in the Course of human events it becomes necessary for one people to dissolve the economic bands which have connected them with another and to assume among the powers of the earth, the separate and equal station to which the Laws of Nature and of Nature’s God entitle them, a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation.

We hold these truths to be self-evident, that all people are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, Justice, the pursuit of Happiness, a Healthy Natural Environment and Sustainability for ourselves, the Third Generation and the Seventh Generation.

— That to secure these rights, Organisations are instituted among Communities, deriving their just powers from the consent of the Members,

— That whenever any Form of Economy becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute a new Relationship, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness. Prudence, indeed, will dictate that Economic Relationships long established should not be changed for light and transient causes; and accordingly all experience hath shown that mankind are more disposed to suffer, while evils are sufferable than to right themselves by abolishing the forms to which they are accustomed. But when a long train of abuses and usurpations, pursuing invariably the same Object evinces a design to reduce them and their environment to ruin, it is their right, it is their duty, to throw off such Economic Constraints, and to provide new Guards for their future security and sustainability.

— Such has been the patient sufferance of this community; and such is now the necessity which constrains us to alter our former Systems of Business Growth for its own Sake and Environmental Exploitation. The history of the present Theory of Economics is a history of repeated disasters, injuries and usurpations, all having in direct object the establishment of an absolute Tyranny over communities and the environment. To prove this, let Facts be submitted to a candid world.

The Growth Economy for its own Sake has plundered our seas, ravaged our coasts, felled our forests, polluted our water and fouled our air.

The Growth Economy for its own Sake has exploited our talents, put us into debt, degraded our culture and eroded our relationships with the members of our community.

The Growth Economy for its own Sake has exploited mineral resources which by right should belong to people in perpetuity in order to obscenely enrich a few in the short term.

The Growth Economy for its own Sake has paved our farms, sprawled our towns, and destroyed the quality of life of our people and their children and grandchildren.

The Growth Economy for its own Sake has convinced us, for more than a century, to ignore the voice of scientific knowledge and reason in order to continue the acidification of our air and oceans through Sulphur Dioxide, Nitrous Oxides and Carbon Dioxide emissions from combustion of fossil fuels.

The Growth Economy for its own Sake has exploited the labour of people and environments that have no protection from ill use, and has persecuted people who worked for economic justice and equality.

The Growth Economy for its own Sake has assaulted the morality of our youth and treated them as a target market rather than with the respect of future citizens and community members.

The Growth Economy for its own Sake has corrupted the purpose of our governance and civic institutions, it has usurped the purpose of our curiosity and research efforts, and it has shifted the motivation for the education of our young from development of their intellect and character to exploitation of their labours for further growth of the economy.

We, therefore, the Representatives of the Transition Committee of Oamaru, Assembled, appealing to the Supreme Judge of the world for the rectitude of our intentions, do, in the Name, and by Authority of the good People of this community, solemnly publish and declare, That this Community is, and of Right ought to be Free, that we are Absolved from all unsustainable and perverse requirements of the Growth Economy for its own Sake, and that all connection between this Community and the Growth Economy, is and ought to be totally dissolved; and that as a Free and Sustainable Community, we have full Power to reduce fuel and electricity consumption, restore our environment, protect our culture, nurture our agricultural assets, set aside our resources, refrain from extracting minerals, stone or fossil fuels, conclude Peace, contract Alliances, establish Local Commerce based on our own principles and theories, and to do all other Acts and Things which Sustainable Communities may of right do.

— And for the support of this Declaration, with a firm reliance on the protection of Divine Providence, we mutually pledge to each other our Lives, our Fortunes, and our sacred Honor.

The emotional story that accompanies the text can be found on transitionculture.org.

24.11.08

Economy worse than in the Great Depression.

It's a week of fALk's favorites reappearing. My favorite mathematician - yes, I have such a thing - is Benoit Mandelbrot, ever since the day he coined the term "fractal" and made me play endlessly with Mandelbrot "Apfelmännchen" renderings on my Commodore Amiga 2000, applying new formula alterations and watching the 7.09 MHz processor do its duty (yes, it was slow). Said man is the mentor of Nassim Nicholas Taleb - an author whose book "The Black Swan" I almost bought once but then didn't, and I have regretted not reading it ever since, as it forecast the current and all future financial collapses. Good thing boingboing linked an interview with the two men explaining, in a bit too little but good enough detail, why the current crisis is so scary - yes, maybe the sack of rice falling over in China had something to do with it in the end. Oh yes, the greatest mathematician of modern times says it is quite possible that we are in a worse position than in the Great Depression.
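Just for fun, here is a minimal sketch of the escape-time iteration behind those Apfelmännchen renderings (written in Javascript rather than whatever ran on the Amiga; the function name and parameters are mine):

```javascript
// Minimal escape-time sketch of the Mandelbrot iteration z -> z^2 + c.
// Returns how many iterations the point c = (re, im) takes to escape,
// or maxIter if it seems to stay bounded (i.e. belongs to the set).
function mandelbrot(re, im, maxIter) {
  var zr = 0, zi = 0;
  for (var i = 0; i < maxIter; i++) {
    var nextZr = zr * zr - zi * zi + re; // real part of z^2 + c
    var nextZi = 2 * zr * zi + im;       // imaginary part of z^2 + c
    zr = nextZr;
    zi = nextZi;
    if (zr * zr + zi * zi > 4) return i; // |z| > 2: point escapes
  }
  return maxIter;
}

// Example: the origin stays bounded, a far-away point escapes at once.
mandelbrot(0, 0, 100); // 100
mandelbrot(2, 2, 100); // 0
```

Color each pixel by the returned iteration count and the "little apple man" appears; the "formula alterations" were just variations of that inner loop.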


Some info from Wikipedia about the book (linked above):

Why do people tend to neglect rare events? Partly because humans underestimate their ignorance in most situations—the effect of unexpected events is far more significant than people often imagine. Taleb argues that the proposition "we know" is in many cases an illusion—the human mind tends to think it knows, but it does not always have a solid basis for this delusion of "I know".

Taleb also questions the authority of experts. The "truth" behind science is limited to certain areas and methods, and in many areas having an academic degree and presenting oneself as a scientist is irrelevant. Indeed, authority can stifle empirical experience which, so many times, has proven to have a sounder base for accuracy.

Extreme events do happen and have big consequences. Examples abound, including September 11th. The Internet was scarcely anticipated, and it is a development that has had a profound effect. The impact of extreme events is amplified precisely because they are unexpected.

8.12.07

War Is Over - if you want it

27 years after the death of John Lennon, people still seem not to want it. Yoko Ono has a very lovely letter on the Imagine Peace website, together with a great, moving video. Do you want war to be over? Boycott companies that produce weapons, don't vote for leaders who are into militarization and fear mongering, be positive.

Imagine Peace

24.03.07

Web Development in 2007

Or: Does Internet Explorer suck intentionally?

I have just completed a small web project — nothing life changing, nothing fancy, nothing ridiculously cool, generally a very very small project — perfect for fiddling around with some technology I have wanted to employ for a long time. Not having done semi-serious web development since I redesigned this blog, I was curious about a number of questions, especially "What is wrong with Internet Explorer and why is it wrong?" — but we get to that way, way later (the most interesting part, but the rest leads up to it). First, some basic fact checking:


"Whats the backwards compatibility path used these days?"

Browsers change, get recoded, disappear, reappear and fight with each other — that's generally the situation each and every web developer on the planet faces on their first website. There is no way to support every browser on the planet in every version number ever released — period. Anyone who looks at Internet Explorer 4 Mac Edition and makes a website work on it the same way this website works on a Firefox 2.0 installation deserves a lifetime "you are wasting your time" award.
Generally I approach this question with a certain bias. First and foremost I want to push technology to the edge, because — as stated here before — only technology that gets used gets pushed forward, redeveloped, reused and generally made more useful.
But I am getting sidetracked. So here we are at the beginning of 2007, some way into a fresh, awesome millennium full of (sometimes) exciting technological advancements: how far does a normal "mom and pop" website, one that needs to reach a broad general audience across the full spectrum of age and technological knowledge, need to push browser compatibility? Since this website intends to sell you something, it needs to work — easy, clean, simple and perfect. Now if we look at browser statistics these days (I don't believe any of them, but then I generally don't believe statistics I haven't come up with myself, so that point is moot) the field is WIDE open. The general consensus is that there are a total of four browser engines (for computer web browsers, that is — more on that in a minute) on the market worth looking at:

1. Internet Exploder
2. Mozilla/FireFox/Gecko
3. Safari/KHTML/Konqueror
4. Opera

For me personally and for this specific project, one browser falls out of consideration right away. I am really sorry, my open source, "we make everything different than the guy next door just so we are different and are perceived as cool" friends — Opera is a non-event for me, and I would venture to guess for about 99% of web developers. Yes, according to statistics Opera has a whopping 1.5% market share. I have met only two people in my whole life who use Opera, and if those two (totally unrelated to each other) give a picture of the Opera-using population, then it's safe to say they are technology-savvy enough to figure out that when a website doesn't run it might be the browser — and I am sure they have 10 other obscure browsers on their machines to access the site with. That goes out to the Opera team as well: if you want your browser to be used, make it 100% CSS/HTML/XML standards compliant and 100% Javascript/DOM compliant, because web developers have a life, and there are really enough problems to fix without looking at every obscure "me too" browser on the market. I really do love and try to support small software rebels, but my prior experience with Opera was so BAD (in development terms) that I am absolutely sure ditching it will not cause any ripples in the space-time continuum, and it gives me at least 10% more time out of web development to rant here.
With this out of the way, you, dear reader, might say: "Hey, but what about Safari/KHTML? It's similarly obscure and has a similarly small market share." Yes, dear reader, at first glance it might seem so, but from personal experience I can name about 100 people (compared to Opera's two!) using Safari daily, because it comes with a more or less widely used operating system called MacOSX, and as it is with these bundled apps — some people think they have no choice but to use them. The great thing about Safari — besides being bundled and forced down mom and pop's throats, and totally unlike Opera (never used 8, but used 7 and 6) — is that it's about the most standards-conformant browser on the planet — yes, even better than Firefox. It is even recommended as a standard reference platform (wherever I read that — if I find it I'll post a link). So even with a tiny market share — which I personally think is really at least five times as much as in the "statistics" — the developers of KHTML/Konqueror, together with Apple's enhancements, achieved something Opera has utterly failed at: eliminating the need to specifically test for this platform. When you adhere to the standards set by the W3C, you can be 98% sure that it runs, looks and works beautifully. That goes for Javascript/DOM, CSS, XML and XHTML.
Another great thing about it is that it's automatically updated with the system (the Safari variant — Konqueror users are probably also very up to date, as Linux users are in general), so you can be sure that most people using Safari are on one of the latest versions. So testing is constrained to 2.0 and onward.

Moving up the percentage ladder, we approach the burning fox. While I was not very impressed by the early siblings (Camino first, e.g.), Firefox is now a stable, mostly standards-conformant platform, and with the FireBug plug-in it has become my main web development choice (this might change with OSX 10.5, but I can't say anything — NDA). So it's clear that my developments will work out of the box with Firefox and hopefully all Gecko-compliant browsers. So how many versions back do you need to test? I don't test anything before 1.0, because people who ran Firefox before 1.0 can be assumed to be intelligent, fast technology adopters who probably have the latest version installed by now. Actually, I am not even testing versions later than 1.1 at the moment, because I think the majority will only upgrade at X.0 releases, and those hopefully didn't break things that were working (and you cannot under any circumstances be responsible for any and all nightly builds there ever are of any open source browser anyway).

With those out of the way, we get to the point where things start to get ugly. That point is the internet venom called Internet Explorer — or, as it is nicknamed by about every serious web developer: Internet Exploder. In case you have not heard of it — it's the browser that Microsoft "pushed" out the door in the mid 90s to combat Netscape's dominance on the early internet. It's the browser that started the browser wars, killed off Netscape (temporarily) and has since earned Microsoft a couple of antitrust lawsuits and A LOT OF HATRED among web developers of all kinds. The problem is: Microsoft won that browser war (temporarily), and the antitrust lawsuits have done nothing to stop the bundling of that browser with the most used operating system in the world — namely Windows. So with about 60% browser market share as of last month (if you want to believe the linked statistics), it has more than double Firefox's share and just can't be ignored, no matter how much you swear. Now all this would only be half as bad, but those 60% are quite unevenly distributed between the three main version numbers: 5.0, 6.0 and 7.0. Looking at the individual percentages, each has more than double the percentage of Safari, so you had better support them all. Heck, I would even throw in 5.0 Mac Edition for the fun of it, because I have personally witnessed people STILL using that! Now a person not experienced in web design might say: "Hey, it's all one browser, and if you don't add any extraordinarily advanced function, 7.0 should display a page like 5.0 and everything should work great."
Well, without going any further down a well-worn path, I can only say this: It fucking doesn't. If you need to support people using Internet Explorer, you need to go back to at least version 5, including the Mac Edition.
Now if Microsoft had tried to support web standards as they are more or less set in stone by the W3C, this would all be only half a problem. But Microsoft chose to go down their own path and alter little things in the W3C spec — the best-known example is the box model difference in CSS: in the W3C model an element's width covers the content only, while old IE counted padding and border into that width, so the same stylesheet produces different layouts.
(I am going to get into that in a second — I just need a way to round up this section.)

What I haven't touched yet — because of a clear lack of experience — are phone and other device browsers (gaming consoles). For this specific project this was no issue, as I think the number of people using a phone to order a highly specialized documentary DVD is close enough to 0. Gaming consoles are definitely not the target group of this DVD either. For up-and-coming sites out of this office I will clearly look into the "outside of the computer" browsers and will surely post my findings here — generally I think they are all going to move to an open source engine like Gecko/KHTML sooner or later (the iPhone will drive that; Nokia has already decided to use KHTML, e.g.; the Wii is using Opera — I tried browsing on the Wii and it sucks badly compared to all the other stuff that's beautiful on that machine).

To round this up: if you want to reach the mom and pop majority of the web, I concluded you have to test on Internet Explorer back to version 5 (including the Mac Edition), Firefox 1.0 and upwards, and Safari 2.0 and upwards. You may also want to check your site with a browser that does neither Javascript, nor images, nor anything in that regard, to make sure your site is accessible for the blind and readable by machines (Google robots, e.g.).

Now with that out of the way the next question formed in my head:


What content management system should I use?

While this specific project needs few content updates, it still needs some (adding new shops that sell the DVD to the reseller list, e.g.), and more importantly — it had to be deployed bilingually. Both of these considerations prompted me to go with a CMS. Now I don't know if anyone has looked at the CMS market lately — I have done some intense research (also for a different project) and it's quite a mess. There are basically two types, blogging-centric and classic CMS-centric, plus a lot of in-between breeds.
Since I don't want to bore you too much: most of the open source CMSs can be tested out at the great site opensourcecms.com.
Personally, the only CMS I have used (and am still using, for various reasons — basically I really do like it) is the blogging-centric Movable Type (not open source, and costing some dough). But Movable Type is so blogging-centric that doing anything else with it is counterproductive (though it can be done). So me — freshly in the CMS game, knowing that "blogging-centric" is not what I want here — looking at all the options, I found it very hard to decide on one just by looking. The user comments on opensourcecms.com are very helpful for sifting out all the ones in pre-beta development status. Left over are basically the big four CMSs: Typo3, Mambo, Drupal and Plone, each with their own good and bad sides. The one I really, really liked from a pure technology, feature and stability standpoint was Plone, but Plone depends on Zope, and for Zope you need a very heavy-duty server that runs just that — I don't have one. The learning curve for Typo3 seemed much too high for me — thanks, I am already used to PHP/Perl/XHTML/Javascript/CSS etc. and have no need to learn another obscure description language on top of that just to use a CMS.
This left Mambo and Drupal as the likely choices. Mambo's architecture seems dated and is at the moment in a state of flux and recoding — I do not like unstable software and have no need to beta test more software than I already do — so Mambo was out. Drupal came out as the winner in the CMS field at the moment — but NOT SO FAST, my friend. I installed it and used it. It has a flashy web 2.0 interface with lots of useful functions. Well, there are SOOO many functions that would never be needed for this project. It is also very heavy on the server in the default install (and I had no urge to get into the 200-page discussion on optimization techniques for Drupal in their forums). It became clear that this CMS was not the solution; the only function I was really looking forward to was including Quicktimes and MP4s in an easy way. It turned out that including them is easy — having them show up the way I like, and not in the Drupal developers' vision of "another tab in a tab in a window in a frame", proved extremely difficult.
This left me with either going with a quickly hardcoded website that I would need to maintain for the next 5 years, or digging up a CMS that I had used before and almost forgotten about — x-siter.
This CMS is written by fellow CCCler Björn Barnekow and is totally unsupported in any way other than "ask him and he might reply". The beauty of it is that it is ULTRA lightweight — he himself describes it as the smallest (code-wise) CMS out there. It is pure PHP, and even if you have no clue about PHP it is very, very easy to understand what it does. From the perspective of the end user who maintains the site, the approach is unrivaled and beautiful, because you just edit paragraphs of text, add pictures etc. on the page they belong to. There is no fancy meta-storage system for all the pages, no folders containing the pages — you edit right inside the individual page. This is a huge advantage if the person you need to teach how to update the site is your average secretary or such, because she browses to a page she wants to change, logs in, edits it and logs out — it's very, very close to WYSIWYG editing and very easy to explain to everyone.
The layout possibilities with x-siter are also well thought out, giving you an adjustable typographic raster that you can put pictures or text into. A very nice approach. So why isn't everyone with a small website using x-siter, and why has nobody heard of it? Well, first of all it's more an internal tool for Björn that he shares without any official support, and there is not much documentation inside the code either. He thinks you might not need to touch much code, and generally he is right; sadly, design realities are different: I have a concept in my head of how a website needs to look, and if the CMS tries to make me do it differently, I'd rather adjust the code of the CMS than adjust the look in my head. This is where x-siter shows a big weakness, because the code is not very modular and not very well commented, so I had to change quite a few things, and now the code cannot be easily upgraded. But generally, if you need a very fast, small site that needs to be updated now and then, x-siter is definitely worth looking into. Even one Max Planck Institute switched from Plone to x-siter because it's so much faster, actually scales quite nicely and has good user account management. It does lack some of Drupal's advanced features, but generally I do not miss them (and most can be added through other means anyway).

So I employed x-siter and heavily modified it to get the certain look I wanted (taking some headlines out of tables and into CSS divs etc.). Since the site is pretty simple, I needed an excuse to implement at least one advanced, cool, dynamic function in there.


What cool "new" "web 2.0" technologies are really cool and worth testing out and are generelly usefull for this small website?

Well, I couldn't find much. First I thought I'd rewrite the whole CMS to make it fully dynamic (without any full page reloads — imagine that), but thank god I did not go down that route. There was one function of the page, though, that definitely needed some design consideration. It's a function whose return kind of breaks the whole look and feel of the app by generating an ugly error/confirm HTML page.
That function is a simple form mailer for ordering the DVD. Now this function — needless to say — is also the most important function on the whole page.
So my thinking went down the route of: "Hey, I want the return page of the formmail.cgi to replace just the div of the form. If there is an error, I should be able to go back to the form and correct it (without having to fill it out completely again)."
Great — that's a simple AJAX request to the server, putting the returned HTML into the DOM of the current page, with the option to return to a saved state of the DOM. YIPPEE, some coding fun — or so I thought.
Generally, implementing this with FireBug and SubEthaEdit was dead easy — almost scary easy (look at the code at the bottom). Here is how I did it:

First, I replace the normal form button with an ordinary "javascript:function()" link button, via a piece of Javascript placed below the normal form button. That ensures that people without Javascript can still order the DVD the normal, non-AJAX, ugly, unintuitive way of the last millennium. Those people get a normal submit button, while people with Javascript enabled get the AJAX button, since those people should also be able to use the AJAX functions.
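Roughly, and with my own hypothetical names rather than the site's actual code, that swap looks like this:

```javascript
// Hypothetical sketch: swap the plain submit button for a "javascript:"
// link. The script runs inline below the form, so browsers without
// Javascript never execute it and simply keep the normal submit button.
function ajaxifyOrderForm() {
  var plainButton = document.getElementById('order-submit');
  if (!plainButton) return;
  var link = document.createElement('a');
  link.href = 'javascript:sendOrderViaAjax();'; // IE-safe "onclick" (see below)
  link.appendChild(document.createTextNode('Order the DVD'));
  plainButton.parentNode.replaceChild(link, plainButton);
}
ajaxifyOrderForm();
```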
So the user fills out the form and hits the AJAXified "submit" button. The form data is then sent over a standard asynchronous connection through a standard XMLHttpRequest. At this point you already add browser-specific code, but this part has been figured out — and the code you add JUST for Internet Exploder is already 3 times as long as the code would be if that browser functioned normally.
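The browser-specific part is the request object itself; the common pattern of the era looked like this (handleResponse and the field string are placeholders of mine):

```javascript
// Cross-browser XMLHttpRequest creation, circa 2007. Standards browsers
// (Firefox, Safari, Opera, IE 7) expose a native constructor; IE 5/6 on
// Windows only offer it as an ActiveX object, in several vintages.
function createRequest() {
  if (window.XMLHttpRequest) return new XMLHttpRequest();
  var versions = ['Msxml2.XMLHTTP', 'Microsoft.XMLHTTP'];
  for (var i = 0; i < versions.length; i++) {
    try { return new ActiveXObject(versions[i]); } catch (e) {}
  }
  return null; // no AJAX support at all (e.g. IE 5 Mac Edition)
}

// Usage: post the serialized form data asynchronously to the CGI.
var req = createRequest();
req.open('POST', '/cgi-bin/formmailer.cgi', true); // true = asynchronous
req.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
req.onreadystatechange = function () {
  if (req.readyState === 4) handleResponse(req.responseText); // hypothetical handler
};
req.send('name=...&email=...'); // serialized form fields
```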
Anyway, the data is then processed by the normal nms formmailer.cgi, which returns a (sadly non-XML) HTML page. I then parse this HTML and check whether it is an error, or even a server error, or whether the result is an "ok", and then drop the specified output for each case into the DOM of the webpage in the only correct way (which is NOT innerHTML!).
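A sketch of what "the only correct way" amounts to — building nodes instead of assigning innerHTML (the element id is my own invention):

```javascript
// Hypothetical sketch: drop a result message into the form's div using
// DOM methods only, no innerHTML.
function showResult(messageText) {
  var container = document.getElementById('order-form-div');
  while (container.firstChild) {
    container.removeChild(container.firstChild); // clear the old content
  }
  var p = document.createElement('p');
  p.appendChild(document.createTextNode(messageText));
  container.appendChild(p);
}
```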
Before I exchange the data of the web form with the result page, I save the web form with all its content in a DOM-compatible way using cloneNode ("it's just one line of code!", I thought). If the result is an "ok", I purge the stored data and tell the user the order is being processed, along with this and that data he has sent. If there is an error, there is a Javascript link in the result page that, when clicked, exchanges the result page with the form and all its content.
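The save/restore is essentially the cloneNode one-liner plus a swap (again, my own naming):

```javascript
// Save the filled-in form subtree before replacing it, so an error page
// can offer a "back" link that restores every typed value.
var savedForm = null;

function saveFormState() {
  var formDiv = document.getElementById('order-form-div');
  savedForm = formDiv.cloneNode(true); // deep clone: all children included
}

function restoreFormState() {
  var current = document.getElementById('order-form-div');
  current.parentNode.replaceChild(savedForm, current);
  // Caveat described further down: IE drops some attributes (like a
  // javascript: href) from clones, so those must be re-applied by hand.
}
```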
So far so good. This part — coding plus learning the whole technology behind it — took me three hours.

So the website looked great, the functions worked as expected, and since I am hyper-aware of the CSS box model issues of Internet Exploder, it even looked great in Internet Explorer 5 on the Mac. At that point — a point of about 20 hours of total work (including digging through x-siter's PHP!) — I considered the site done.
BAD idea.


Problems with Internet Explorer 5.0, 6.0, 7.0

The first thing I noticed was that IE 5.0 Mac Edition does not support XMLHttpRequest AT ALL, and it does not do any DOM manipulation either. That made me aware that a very, very few users might have a browser that a) has Javascript but b) does not support any of the modern AJAX and DOM functions.
Well, that was easily fixed by figuring out, in the very first called script (the one that replaces the submit button), whether an XMLHttpRequest can be established. If not, the user gets the ugly, normal, non-Javascript version — if yes, the browser should be capable of getting through the whole process.
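The check itself is a plain capability test, something like this self-contained sketch:

```javascript
// Capability check: only swap in the AJAX button when the browser can
// actually create a request object. IE 5 Mac has Javascript but neither
// a native XMLHttpRequest nor ActiveX, so it keeps the plain submit button.
function ajaxSupported() {
  if (window.XMLHttpRequest) return true; // standards browsers
  if (window.ActiveXObject) {             // IE 5/6 on Windows
    try { new ActiveXObject('Microsoft.XMLHTTP'); return true; }
    catch (e) {}
  }
  return false;                           // e.g. IE 5 Mac Edition
}
```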
Again a sigh of relief on my side, and 5 minutes of coding later I closed up shop for the day. The next day would just be some "routine" IE 6.0 and 7.0 testing on the Windows side, and then the page would be done — or so I thought. I was very happy, because I had very portable, future-proof code that was small, lightweight and easy to understand. There wasn't even a bad hack in there, except for the XMLHttpRequest check.
Opening the page on the Windows PC with Internet Explorer 7.0 my heart dropped to the ground like a stone made out of lead in a vacuum on a planet the size of our sun.
Nothing worked. The layout was fucked (I am quite used to the CSS box model problem, so I had avoided that in the first place!), and the submit thing did not work at all — clicking on the button didn't do shit.
The CSS was fixed after fiddling with it for two hours; there seems to be a "bug" in IE 7 with classes vs. ids and z-index stacking. Fixing the Javascript was much harder. I couldn't use FireBug to get to the problem — because in Firefox the problem didn't appear. The IE 7 debug tools are as crude as ever (a Javascript error console, which did not produce any errors).
So I placed strategic "alert"s in the code to see how far it would get and what the problem was. It turned out the first problem was that IE cannot add an "onclick" event listener to something I changed inside the DOM after the DOM was drawn (add 3 hours to the time total). I struggled for a solution and rummaged through the web for any clues. It seems that IE up to version 7 (that's the current one!) cannot do "setAttribute" as the W3C says; instead you have to set every attribute through a direct property assignment (object.attribute = "value in stupid string format";) — so for a link it's (object.href = "http://your link";) instead of just putting it all in through object.setAttribute("attribute", "value");.
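In code, the difference looks roughly like this:

```javascript
var link = document.createElement('a');

// What the W3C DOM spec says, and what Firefox/Safari happily accept:
link.setAttribute('href', 'http://www.example.com/');

// What IE up to version 7 actually needs: one direct property
// assignment per attribute.
link.href = 'http://www.example.com/';
link.className = 'order-link'; // "class" is reserved, hence className
```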
Now, if you think you can also add an event listener this way by assigning to object.onclick, forget about it. It turned out through extensive testing (2+ hours) that there is currently absolutely no way to add an event listener through normal means to an object that was created after the webpage was built — in Internet Explorer, that is — while in Firefox and Safari this works wonderfully. So my solution was to use a "fake" onclick through href="javascript:function();" — thank god that this exists, otherwise I would have been forced to either write 200 lines of code for custom event notifiers (and I am not sure they would have worked) or abandon the already working approach altogether.
If this still does not sound like a serious enough deterrent — this was not all it took to solve the problem. Creating an object and setting its href attribute to the "javascript:" value before writing it into the DOM also does not seem to work in Internet Explorer. I had to actually write it into the DOM, pull the object by ID again, and then change the href attribute. This doubled the code for this portion of the Javascript.
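So the working order of operations for IE ended up being: insert first, re-fetch by id, assign last. A sketch, with my hypothetical handler name:

```javascript
// IE workaround sketch: a "javascript:" href set on a detached node got
// lost, so write the node into the DOM first, pull it back out by id,
// and only then assign the href by hand.
var link = document.createElement('a');
link.id = 'ajax-submit-link';
link.appendChild(document.createTextNode('Submit'));
document.getElementById('order-form-div').appendChild(link);

var liveLink = document.getElementById('ajax-submit-link');
liveLink.href = 'javascript:sendOrderViaAjax();'; // hypothetical handler
```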
Now, the problems were far from over. As you might remember from earlier, I save the part of the DOM containing the form and its content so I can get back to it in case there is an error with the form. This worked great in my two standards-conformant browsers. So I am at the point where every important browser can get the form mailed out. If you get an error, you get the "back" button, which I could also make work in IE through the object.href="javascript:" trick. Now I was sweating that this whole "cloneNode" business might not work in IE and I would have to parse the whole form and rewrite it, but clicking the now-functioning button did in fact work as advertised (by me), and I returned to the form. But the trouble was far from over, because now my "submit" button didn't work again. Imagine that — I am cloning the DOM, as in CLONING: I take each object with all attributes and values and CLONE it into memory. When I put this CLONE back in, it should be just like the original — well, it is in Firefox and Safari. Internet Explorer seems to choose which kinds of attributes and values it clones and which not, because the values of the fields had been cloned and correctly copied back, yet the href attribute was apparently not clone-worthy and completely forgotten. At that point I had been sitting a full day at debugging IE's Javascript/DOM implementation. So on the third day I just made the dirty "grab the object again and change the href manually" hack and let it be.
In total I have a recorded 27.4 hours of development/design for the whole page, including PHP scripting (which I have no clue about), and 13.6 hours of IE CSS/Javascript debugging (an area I understand very thoroughly). My code bloated by a third just for IE hacks and is now less readable and less manageable for the future. And it definitely runs slower — not that you notice in this small app, but extrapolating to a huge web app like Google's spreadsheets, I think the speed penalty is mighty.


Why is Internet Explorer so bad?

Throughout this project (and many before it) I have asked myself this question. I am no programmer and have no degree in computer science, yet I think it can't be THAT hard to make a good standards-compliant browser. Microsoft had about seven years (yes, SEVEN YEARS at LEAST) to make Internet Explorer standards compliant, listen to the complaints of the millions of web developers and redo the whole thing from scratch — in the meantime Firefox has made it to version 2.0, Safari will soon be at version 3.0, and even a small shop can churn out a browser at version 8 — and even though it's not perfect, Opera is still MUCH better than IE ever was.
Now, Microsoft is swimming in money and has thousands of coders in house who are all probably a million times smarter than me when it comes to programming. The bugs that are present are enormously obvious and waste millions of hours of web development/design time. The box model problem should have been a matter of adjusting some variables, and the Javascript engine — well, after a rewrite with native DOM support (at the moment I would call it a hack at best), all the problems should be gone.
Now, Microsoft had the chance to fix this with Internet Explorer 7, and while transparent PNG support (a hack) and finally a fix of the box model problem (also not 100% fixed, so I hear) have been touted as breakthroughs in supercomputing or something, the whole DOM model Microsoft uses does not work (and they admit that on the IE dev blog — promising to look into Javascript/DOM for 8.0 — in 10 years). This at a time when the whole web wants to use the DOM to make rich web applications with great, consistent interfaces. I have looked into some of the AJAX "make my webpage move and look all funky" frameworks, and I can tell you: they have the same problems as me, and sometimes more than half the code in them is there to get around IE limitations — which slows them down to hell, I guess.
So IE 7 is almost a non-event — and I am asking now, even louder: WHY COULDN'T MICROSOFT FIX THOSE THINGS?

First, my personal take: they have no clue — this multibillion dollar company has no idea why someone would want a consistent interface on a website that doesn't reload just because a number changes somewhere down in the text of the webpage. The reason I think that: look at Vista. Vista is flashy, has an inconsistent interface (I'll just say: 10 ways to shut your computer down!) and uses every processor cycle available on your CPU and GPU just to run itself (so much that not even cheap modern laptops can run the full version flawlessly and fast). So if they don't realize that this is important for themselves, why would they realize that these are important concerns for developers outside the Microsoft bubble?
Now, pretending that a multibillion dollar company is too dumb to make such assumptions is probably as dumb as thinking the Bush administration has no advanced hidden plan behind what they are doing (well, you know, as with Microsoft — they could be just plain dumb, or have some greater goal that nobody fully understands, or that those who do understand don't have a loud enough voice to make public).
So, since we are not dumb over here, we stumble upon ideas of why this all is the way it is. The best take is by Mr. Joel on Software, in his blog entry called "API Wars". Joel is a software developer writing bug tracking software for Microsoft's operating systems. He is well known and very respected in the developer industry, and his sometimes controversial statements cause quite a stir now and then, but he — being very much inside the OS and probably able to read Microsoft assembly code backwards — is most of the time spot on. In the linked article he talks about how Microsoft has always been about protecting their main treasure — their API, with the Office API and its closed document structure the crown jewel above everything else. He also talks about how they have so many APIs that developers are getting confused, and since most of the APIs are for sale, developers nowadays turn away from the APIs Microsoft provides and turn TO THE WEB to develop especially small applications — the reason most shops are on Windows is believed to be exactly those simple, small applications only available on Windows, plus Office.
Now I said "the web" — the web has the notority to run on any OS somehow and manages to circumvent Microsofts monopoly in a nasty uncontrollable way — poor Microsoft.
Now you probably understand where Joel and I are going with this — yes, Microsoft does anything to stop the web from spreading too fast and getting too useful before they have found a way to completely control it — and they try so hard: DRM, TCPA etc. etc. are all designed to control web content — good thing that web apps are not content, and Microsoft is acting too slowly. When you read through Joel's entry, you get quite a clear understanding that Microsoft is not interested at all in making web development easy, and the longer their Javascript does not work and their interactive DOM only works in some strange emulated way (change something in the DOM and look at the HTML source — you will not see your change reflected there), the longer they have a monopoly. That their monopoly is threatened by web apps is apparent from Google's spreadsheet and word apps — sadly for Microsoft, these already run even on a broken IE 7 (poor Google guys who had to debug that).
I do see Microsoft trying to "fix" these problems, because this is turning into a publicity backlash that not even Microsoft can hold out against (the box model problem circulated for years, but then gained so much steam that Microsoft HAD to release a statement saying they were working on an IE 7 that addresses some of those things — that was three years or so ago). So many developers/admins/tech addicts etc. suggested to friends that using Firefox is more secure (thank god IE has had so many open doors for viruses) that Firefox usage exploded and is now eating slowly but steadily into the IE market share.

Now Microsoft faces a very big problem — a problem so satisfying for web developers that I would sell my kidney to see it unfold (maybe not, but I would hold a party in case of that event). Microsoft understood in the mid 90s that with a large enough market share they could force any competing browser out of the market by introducing proprietary standards (ActiveX, their own implementation of CSS and many more), because web developers go with the browser with the biggest market share. That worked quite well and played out like they intended — almost. Web developers are a strange bunch of people, and some use Linux and other evil stuff, so they made sure that big companies tried to stay cross-browser compliant (I am one of those who wrote about 100 emails to big companies telling them their website doesn't work on browser xy, and the millions of others doing the same are the reason we do not have ONLY Internet Explorer today — Microsoft really almost completely won — it was close).

Now back to the point, and to the happy future outlook. If Internet Explorer's market share were to drop below — let's say — 10%, I would be the first person to drop Internet Explorer support completely and rather debug small things for Opera in addition to the two nicely working Firefox and Safari browsers. My thinking is that the hatred for Internet Explorer among web designers/developers has grown to such proportions that I would NOT be the only one — heck, I would even say this browser fades from visibility in less than a year. This would satisfy so many people who have lost so much time in their lives just because some stupidly rich multibillion dollar company wanted to get richer on their backs. Now, this is not unfolding right this minute, but I can see that if they wait too long with IE 8 and with fixing the aforementioned Javascript/DOM problems, this might go faster than everyone thinks. Web time is a hundredfold acceleration of normal time, and a monstrous creature like Microsoft might not be able to adjust fast enough — one can really hope.

22.06.06

Record only the important things in your life

The dream of many media gangstas is to have a constant record of the world surrounding them, visual as well as audible (and possibly in holographic quality). This "feed" has been featured in quite a few science fiction writings (Gibson etc. pp.), and while being heralded as the ultimate truth tool, it is also a privacy fear device par excellence. Even if a camera inside your eyes and a direct sound feed straight out of your brain were feasible, you would still have that enormous amount of data of which only a fraction would be interesting - sleeping would just produce a snoring sound and black pictures, after all, and you sleep at least half the day (daydreaming included). This part of the problem seems to be solved by Manabe Hiroyuki, a researcher working for the Japanese communication giant NTT DoCoMo. The device trains itself by looking at your attention peaks. That means when you daydream, the device turns off the audio/video recording and saves the environmental setting as a preset - so when you next encounter a similar situation, it turns on automatically without consulting your brain. As anyone can see - this will probably stay away from the mass market for the moment. Original story found at Engadget.

5.06.06

One half of a Manifesto

Is self-evolving technology held up by bad software?

"One Half of a Manifesto" (PDF warning) by Jaron Lanier appeared on my screen while I was googling for "nonlinear storywriting", and while I have read Luddite statements about the assertions Mr. Kurzweil makes, there never seemed to be an informed comment against the technology religion that spreads like wildfire around the world these days. While I am a supporter of Kurzweil's immediate, drastic change of society through technological advancements, I have been skeptical about the paths he draws toward that change, and I think the problems this would cause in human psychology would probably be so overwhelming that they would cause the collapse of society in general and negate any technological advancements we might have made.
Jaron Lanier seems to think so as well - albeit with the usual "I am American and my beliefs are the utter truth" attitude that I already hate in Kurzweil's dissertations - and he defuses a lot of the predictions about the future made by Kurzweil and the like. His most objective point, one I hadn't thought about but that plagues me every day, is software - or rather the sad state software is still in despite the technological advancements of the past years. And when I type on this computer (not state of the art, but fast as hell compared to 15 years ago), I have to agree. Not even a 1 MHz KC85/III lagged when I was typing something - now, in the year 2006, there are lots of occasions where the computer displays the text I am writing with a huge delay. Of course I have about a billion windows open and run tons of background processes, but the computer is also about a million times faster than when I had my first computing experiences. The problem seems to be that software is not scaling as well as the hardware. He connects this to evolution, noting that evolution is also slow - even if parts seem to evolve fast, there is never a revolutionary step in evolution - and evolution is what the techno pundits hold up as their biggest motive: that technology will one day become more powerful than humanity, more intelligent even, and that technology is an extension of evolution. So, plagued by overhead, too much data and too few solutions, the hyper-expansion of technology seems to be slowing down. Yes, there will be drastic change in the future, but it won't be autonomous, and it will be controlled by humans.
Then again - and this is my belief - a stupid mistake or strange coincidence might make all the difference, as is the case in evolution as well. I do not understand the "humanity will be doomed" approach Mr. Lanier takes in his paper; I go with Mr. Kurzweil on this issue: if there is technology that could wipe civilisation off the map, there is also technology (or information) on how to prevent that, and it would create a balance, as there has been a strange balance all throughout history keeping us from our own destruction.
So in the end the "One half of a Manifesto" is exactly that - one half - the other is uncertain and only time will tell if we ever reach a point where we need to worry about such things. Hilarious is his remark about the future beeing a blend of the best of socialism and capitalism because 95% of the world population will work on help desks that try to fix software problems - I can clearly follow him on those lines - software sucks and this will not change as long as humans write it.

14.03.06

US Army: Peak Oil is imminent

The "Peak Oil will win the bet for human self destruction" camp has a big fan. The Energy Bulletin says that a public document (PDF 1.2MB) by the US Army reveals that the biggest military institution in the world is fearing that the constant flow of fresh black gold will soon start to stutter. The US Army is a huge "Oil Addict" with its millions of vehicles, factories and camps running on the premise that the liquid comes in cheaply. The strategic report reveals that the oil industry has been hiding the fact behind "skewed" statistics. The "official" version of the Oil Industy is riddled with mistakes and misleadings. For the "official" projected Peak Oil point in time the exploration of new oil fields would need to increase five fold - in the last years it has steadily been declining besides new technology.
Now, with the US Army basically saying exactly the same thing as the Peak Oil "advocates", there is still no big media research or reporting. I mean, I understand that they are all lining up behind the power elite - but Peak Oil is something everyone in a developed environment should at least have a thought about - and especially those who are powerful today should fear the energy crisis, as the world would no longer be in the social order we know today after the oil price surges to something like €250 per barrel.
The positive effect is also shown in the US Army document. Apparently the report suggests that the Army should go "energy efficient", with less carbon fuel and green energy alternatives, as soon as possible. Fear can drive innovation.

4.02.06

When Geeks make Fashion

The outcome is neither wearable nor very inspiring nor very futurist nor groundbreaking - even if the show is put on by the MIT Media Lab.
A huge LED display that must weigh a kilogram, to show what you are listening to on your 10-gram iPod nano. A helmet to protect you from sound and head injuries that had to be held in place by the model, otherwise it would have fallen off her head and broken her neck.
The coolest and best functioning piece is a MIDI jacket - originally developed in the 1970s - though the 2.0 version displayed at the show had a malfunction, allowing only the arm slider to work...
The FM radio jewelry looks like an FM radio headset with lots of wires coming from the jewelry - I didn't really get the point of this one - an iPod nano with an FM headset would have been more invisible than this stuff...
What seems a little gross is the living jewelry - you stick a piece of clear plastic on your skin and then an ornament grows out of it.
The bag with a light code that shows whether the wearer likes other people's fashion, the shoes that fold out the faster you walk, and the number-pattern scarf are all very uneventful and won't advance civilization a lot.
Clothing as an interface between body and environment sounds cool - but the accompanying pieces look like a wind fan worn on a body, or a scarf with a built-in gas mask that makes the wearer blind, or a ghetto blaster fitted into an expensive woman's suitcase, or a wedding dress with a built-in iPod - why, how or what - I have no idea...
What gives - do not let geeks near designing clothing - it just ain´t work - let them develop next generation materials for real designers to use - but designing those geeks can´t sorry. And yes the lighting setup in the MIT lunchroom is worth mentioning - because its really really bad - great sponsor they have no need to give them five minutes of airtime thanking them. Please MIT use your geniuses to advance civilzation on a technical level and leave the designing and putting on a show to those who are capable of it. Very obscure.
Have a look yourself if you have pesky RealPlayer.

2.02.06

Google between God and Death

An interesting article on CNNMoney talks about the future of the search engine emperor that we are all addicted to when it comes to expanding our minds with more or less useful information from THE net. They interview Ray Kurzweil and a couple of other researchers, futurists and bankers to foresee the future of Google. And it goes like this: either it becomes the evil media mogul, or it becomes the emperor of the internet, or it becomes God once we are ready to upload our souls into the GoogleCollectiveBrain, or it simply dies under its own weight. I would say, after seeing the recent discussions about how they act in China and how they would bow down to any government other than their own, that active netizens are already becoming aware of the Google gatekeeper's power, and only one small solution somewhere that rivals the search engine - which is still the core of Google - would take them out immediately, leaving shareholders running away as fast as possible and leading to a total crash. My bet here is on distributed search engines like YaCy taking the Google crown - especially in the fascist American reality.

21.01.06

Be Informed: Peak Oil and its ramifications

One of the best researched articles on Peak Oil ever can be found at the Independent. It is written by Jeremy Leggett, a former geologist for big oil companies who scoured the world for black gold. If you want to live through the next 10 years and be prepared for what might come, I suggest taking the half hour to read through this informed, in-depth analysis of our near future. Leggett goes to great lengths to dismantle the current estimates from big oil corporations as to when Peak Oil might come. He shows that not only is Peak Oil a problem, but also that overproduction - as seen in one of the biggest oil fields on earth, in Saudi Arabia - might shorten the life span of our oil reserves considerably. He even slips into the shoes of the big corporations' estimates and shows that even under those rosy numbers we are in for a total crash of our current civilization by the year 2011 - that's in five years. He shows that the dependency of the West will be challenged by the energy hunger of China, which will not stop short of global military confrontation with its one billion people. What really convinced me is the example he gives from the past, where industry estimates turned out to be big lies - namely the estimate given for the US oil field peak in 1956.

In 1956, a Shell geologist called M King Hubbert famously calculated that oil production in the "lower 48" states of America would peak in 1971. Almost nobody believed him. Shell censored the written version of Hubbert's address to the American Petroleum Institute, changing the wording of his conclusion to read that "the culmination should occur within the next few decades". The US Geological Survey, in particular, did everything it could to hike the estimates of ultimately recoverable American oil to a level that would make the problem go away. The US had 590 billion barrels of recoverable oil, the survey said, in 1961, meaning that the industry had 30 years of growth to look forward to.

The years went by and the "lower 48" did indeed hit their topping point. It came a year ahead of estimate, in 1970, at 3.5 billion barrels. Since then, production has sunk down the second half of the curve at a steady rate.

An exact dupe of what seems to be happening now - especially in light of the fact that Shell has already lowered its estimated oil capacity by 20%. The article goes further and is an informed read - apparently there are more people like Leggett coming out of the foggy big oil industry to counter the most optimistic Peak Oil scenarios, and I tend to believe them. At the end of the article, Leggett gives the example of an English town that made itself self-sufficient, cutting carbon emissions by 70% and external energy needs to one quarter.

What they don't want you to know about the coming oil crisis

7.12.05

Sending an email to the future

Ah, it feels good to let the future know you existed. FutureMe.org offers a service where you can write someone (yourself or anyone else) an email that is then sent on a specific date of your choosing - up to 20 years into the future. You can make your email public for someone else to muse over your writing, or make it completely anonymous. I did just that today - and it's private, sorry.... :)

6.12.05

The World between Peak Oil and Singularity

The day when everything changes seems to come closer and closer, if you want to believe one of the "world as we know it" ending scenarios. And there are quite a lot of them out there. Next to Nostradamus and Martians taking over, the two most prominent and believable? Peak Oil and the Singularity - and in between them, self-destruction.
Peak Oil is gaining momentum in a plethora of on- and offline communities. It's the scenario based on the premise that in a couple of years (people estimate at most 15) the world runs out of oil, and therefore out of its base for prosperous survival and capitalistic growth. A scenario where mankind would suffer the consequences of the overuse of one of the most valuable assets of today's earth, having burned it just to move around. No more plastic, no more cars, no more transportation. The scenario either leads gradually to a new society that adapts in time on a grand scale and therefore feels little impact at all, or it disrupts Western lifestyle totally and leads us back to a more earthbound life without many technological advancements, with some super-rich people maybe hanging in there extremely long and grabbing world power.
The other interesting option is the Singularity - the event when one or more or all technological research efforts produce something artificial or biological or subatomic or ? that is better than the brain of today's humans. There is some rivalry over whether it might be nanotechnology, the advancing field of nanobiotechnology, or artificial intelligence on normal silicon-based computers. An artificial intelligence or a highly upgraded human intelligence would spin the invention wheel faster and faster, leading to better and better AI, or upgrading, or more and more nano-compute-bots reinventing themselves, basically on an exponential curve hindered only by a lack of imagination - maybe. This scenario would give us a world we cannot comprehend right now, with everything at stake - maybe even intergalactic space travel or the like. Interestingly, the timeframe for this was once thought to be around 2035 (30 years out) but has been gradually reduced in recent years, due to the already happening speed-up in technology and the cross-breeding of technologies, to about 2015-2025 - so again around 15 years from now, if you are being conservative (some people say this might happen before 2010 - I tend to ignore them).
Besides some institutes and communities rallying around those ideas, it seems that the major powers of this world, and the masses in general, completely ignore these propositions about the future of mankind and stay on the course we have been on since our existence began - study, conquer, develop - yet all this poses so many philosophical questions that it might be time to think about them, if you believe any of it.
I myself have some doubts about both theories. With Peak Oil, it seems the first signs are indeed there, and yes, I imagine it might get a little bumpy, but society is already transitioning - even if very slowly - and migrating towards alternative energy sources and alternative productivity sources, away from oil. I guess there might be a transitional period where a little chaos could break out (maybe a five-year period), but in general the situation should stabilize, and seeing so many non-Western countries still getting along just fine without the vast amounts of oil the West consumes, I guess it might be for the benefit of the earth if that shake-out affects the overconsuming, overgrowing Western world that has everything it desires.
As for the Singularity - I just don't know. As much as I like the theories behind it, I also tend to look at history to see into the future for my amusement, and I do see that in every period of intense growth people were overestimating technological accomplishments. Not only in war (Romans vs. Germans, for example) but also in theory (flying saucers, robots and, yes, Artificial Intelligence itself - seen from the standpoint of the 60s, we should have had computers talking to us intelligently for at least 20-25 years by now). So yes, computing power is growing, and we are advancing in nanotechnology, biology, quantum computing and all the other jazz out there - but are we really capable of developing something that is more intelligent than ourselves - better, faster, more precise, without errors - that still has the amount of conceptual and creative thinking that is so human and has led us here in the first place? Will raw computing power and artificial neural networks really capture where we want to go?

I do see that we will have some great technological products in the future, but I have serious doubts about the Singularity and brain uploads/enhancements. I do see robots taking over most non-creative jobs and helping us out in everyday life. I see people living considerably longer than they are supposed to (maybe one day infinitely, if they so choose), but I do not see a convergent point where this all happens. And yes, cellphones have gotten much smaller, as have iPods, projectors have nanotechnology in them, and so on - but if I look at my everyday computer life, the intelligence inside the machine has not changed a single bit since I got my first one. They are just a little faster - sometimes I think the software I use is even dumber than back when it was simple. My theory is that making something that is better than we are - something that can replicate itself and think on abstract levels beyond us - might simply be too complicated for the human brain, even as a collective whole, to create in the first place. I would of course like to be proven wrong. The idea of the Singularity saving the world is stunningly beautiful - especially if it could lead to such vast technological advancements as space travel for everyone in my lifetime - I just have the gut feeling that we will hit a roadblock somewhere, be it physical, mental or theoretical. Out of all the Singularity prospects I would go with the Artificial Intelligence one if I could choose, as I do not trust nanobots or strangely biologically upgraded humans - with a plain good old hardware/software combo and a lot of silicon, you might at least have the option to change a routine after you have started the process.
Of course there is also the possibility that we nuke ourselves out of existence before any of this happens.

4.11.05

Mechanical Turk - the artificial artificial intelligence

I read this first on golem.de and thought it was a bad joke, but no no - apparently Amazon wants artificial intelligence for their services really, really, really badly, but the AI available is not good enough for them, so they resort to a trick that has been used before: put a human in a machine and call it AI. The invention back then was the Chess Turk - a machine dressed as a Turk that could play chess better than any other human on earth - until it became clear that a real human, small in size, must have been sitting inside. Amazon is basically doing the same thing with their MechTurk, but on a grander scale - they want to make hordes of jobless people their slaves by letting them solve simple puzzles that are too complex for the machines to solve, like writing down the contents of a picture (tagging the web, in effect?) or of movies - well, if you are a developer and have an idea of what millions of jobless, untrained, real-flesh drones could do for you, then you can tap in with their API. In the end Amazon wants to use that data to train an AI database and make all the jobless people jobless again - and many more…
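To give an idea of the developer side, here is a minimal sketch of what tapping in could look like, written with today's boto3 SDK (which obviously postdates this post). The endpoint is Amazon's requester sandbox; the image URL, reward, timings and question text are made-up illustration values, not anything Amazon prescribes. It posts one of those picture-description puzzles as a HIT for three humans to answer:

import boto3

# Sketch only: post a "describe this picture" task to Mechanical Turk.
# Uses the requester sandbox so experiments cost no real money.
mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# MTurk questions are XML; this minimal QuestionForm shows one image
# (hypothetical URL) and asks the worker for a free-text description.
question_xml = """
<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
  <Question>
    <QuestionIdentifier>describe</QuestionIdentifier>
    <QuestionContent>
      <Binary>
        <MimeType><Type>image</Type><SubType>jpg</SubType></MimeType>
        <DataURL>http://example.com/picture.jpg</DataURL>
        <AltText>the picture to describe</AltText>
      </Binary>
      <Text>Describe what you see in this picture.</Text>
    </QuestionContent>
    <AnswerSpecification>
      <FreeTextAnswer/>
    </AnswerSpecification>
  </Question>
</QuestionForm>
"""

hit = mturk.create_hit(
    Title="Describe the contents of a picture",
    Description="Look at one image and write a short description.",
    Keywords="image, description, tagging",
    Reward="0.05",                     # dollars per assignment (made up)
    MaxAssignments=3,                  # three humans per picture
    AssignmentDurationInSeconds=300,   # five minutes to answer
    LifetimeInSeconds=86400,           # task stays up for a day
    Question=question_xml,
)
print("posted HIT", hit["HIT"]["HITId"])

Collect the answers afterwards with list_assignments_for_hit, pay the workers their five cents, feed the text into your database - artificial artificial intelligence.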

Update: The Thesis behind it.

6.06.05

It's true! Apple goes to Intel x86 processors....

[Apple's benchmark charts: PPC vs. Intel, PPC vs. Intel in After Effects, PPC vs. Intel in Lightwave] I can't really describe my feelings about this. Just yesterday I wasn't sure anymore whether Steve would really do it - I didn't call the rumor completely off this time, even though I hated most thoughts about it. Apple moves its complete platform over to Intel processors and starts selling the first machines on 06/06/06 (that is, June 6th 2006). That means if you own an Apple computer and were thinking about selling it to replace it with a new PPC-based Apple computer, you are out of luck for two reasons: no more PPC-based Apple hardware after 2007, and your old computer just lost about 50% of its resale value. It also means most of those wanting/needing a new laptop or desktop will probably want to wait until the switch has gone over - just to get a box that runs 95% of all their apps in emulation mode and the rest as 1.0 beta versions for the next 3 years. As said, it took Apple and its customers 5 years to go from 680x0 to PPC, it took Apple at least 5 years to switch from OS9 to OSX, and it will take Apple and its customers at least 5 years to get back to the point where we are today: an almost stable OS with TONS of software that is finally getting more and more stable and optimized for current hardware. It took some developers 8 years to finally implement Altivec - how long do you think it will take small Mac-only developers, and there are MANY out there, to port their heavily optimized applications to an Intel architecture? I think this is a bad move for all long-term Apple customers and Mac/PPC-only developers; it will kill all Mac hardware sales in the next 2 years, and it will feel very, very bad to be on a Mac platform that will just stagnate for the next 5 years. The future is far now.... I still have to make up my mind whether I want to help Apple at all anymore through the beta-testing program - they burned me alive.
What should I do with the PowerBook now? There is no way in hell I am buying a PowerBook now - even though my old 400MHz one just died and desperately needs replacement. Will I wait until the prices for current G4 PowerBooks drop on eBay? Will they actually drop, and not rather rise because everyone wants a G4 PowerBook for their bookshelf? Will Apple come to my help with that decision? It's a fucking 12 months before they start to ship the first boxes... Likely those will be consumer boxes, as those are the people who run the least amount of software. The pro machines and PowerBooks will then be another 1.5 years off in the future??? Are you kidding me? This company is just killing all high-profit-margin sales for 2 years - that is suicide?! Also, EVERY other computer maker that has tried to move from 680x0/PPC/custom (Alpha) hardware to Intel has died, been bought up shortly before bankruptcy, or at least lost all profit - Be, Commodore, NeXT, Sun and SGI are the most prominent examples. Apple has OSX, and the hardware lock-in will probably remain to ensure great compatibility and the "ease of use". Well, the ease of use just went out the door: "Do you have an Intel Mac or a PPC Mac? What? Your processor? Oh, I have an iMac..." All companies, especially in the entertainment business, have to buy ALL of their software anew... There is simply no way to run a video application or a 3D application in emulation mode - that would cost you the speed gain the Intel hardware would have brought in the first place. That's a lot of money down the drain for a lot of people. I am starting to just bitch around, sorry. I am really sad - an era is over for me - 16 years on 680x0/PPC processors in a 19-year computer life is not something you brush off easily. I ALWAYS liked the systems. I never felt "left out" or "on slow hardware"; even through the 400MHz disaster it all still felt OK for what I did at the time. I am sad to see these times go - I guess I have to come to grips with another change in my life, and I will probably survive it. I feel sad for all the VJs out there who hoped to get a full-res-capable machine in the near future - I guess we all have to wait 2-3 more years for that. That is why I put this under the category "Future is Far" - that is how I feel today.

Pictures taken on June 6th 2005 from http://www.apple.com/powermac/performance/

31.07.04

Transportation Futuristics

A very cool website that shows how the past hoped the future of transportation would look. Not much time to say more other than: VERY COOL :)

http://www.lib.berkeley.edu/news_events/exhibits/futuristics/index.html

29.05.04

How the past saw the future

Via Slashdot comes a link to a great, informative website that shows how people - artists, novel authors, etc. - saw the future at the beginning of the last century. Very detailed website.
http://www.davidszondy.com/future/futurepast.htm

6.12.03

Toner that can be deleted

Yes, all those poor little trees that get chopped down because you printed something with the wrong typeface, or with a thought you didn't like, will now get some relief: a laser printer toner that can be erased. The stuff comes out of Toshiba's labs. It consists of the actual toner (and some ink for your inkwell pen) and a thing that looks like a hybrid of a 1980s copy machine and a suitcase. You put the pages into this thing and leave it running for 3 hours; after that, the pages are blank again. The technology is called e-blue (well, the ink/toner is blue). I think it sucks - what about the power this strange decolorizer needs to run? That will hurt the environment as well. I WANT MY TRUE E-PAPER, as promised 6 years ago.