
The World between Peak Oil and Singularity

The day when everything changes seems to come closer and closer, if you want to believe one of the "world as we know it" ending scenarios. And there are quite a lot of them out there. Next to Nostradamus and Martians taking over, the two most prominent and believable? Peak Oil and Singularity, and in between them self-destruction.
Peak Oil is gaining momentum in a plethora of on- and offline communities. It's the scenario based on the prediction that in a couple of years (people estimate at most 15) the world starts running out of oil and therefore loses the basis of its prosperous survival and capitalistic growth. A scenario where mankind suffers the consequences of overusing one of the most valuable assets of today's Earth by burning it just to move around. No more plastic, no more cars, no more transportation. This scenario gradually leads to a new society that either adapts in time on a grand scale and therefore feels little impact at all, or has its western lifestyle disrupted totally and is led back to a more earth-bound life without lots of technological advancements, with some super-rich people maybe hanging in there extremely long and grabbing world power.
The other interesting option is Singularity - the event when one or more or all fields of technological research produce something artificial or biological or subatomic or ? that is better than the brain of today's humans. There is some rivalry over whether it might be nanotechnology, the advancing field of nanobiotechnology or artificial intelligence on normal silicon-based computers. Such an artificial intelligence or a highly upgraded human intelligence would spin the invention wheel faster and faster, leading to better and better AI or upgrading or more and more nano-computo-bots reinventing themselves, basically on an exponential curve hindered only by a lack of imagination - maybe. This scenario would give us a world we cannot comprehend right now, with everything at stake - maybe even intergalactic space travel or the like. The timeframe for this, interestingly enough, was once thought to be around 2035 (30 years out) but has gradually been moved up in recent years, due to the already happening speed-up in technology and the cross-breeding of technologies, to about 2015-2025 - so again around 15 years from now, if you are being conservative (some people say this might happen before 2010 - I tend to ignore them).
Besides some institutes and communities rallying around those ideas, it seems that the major powers of this world and the masses in general completely ignore these propositions about the future of mankind and stay on the course we have been on since our existence - study, conquer, develop - yet all this poses so many philosophical questions that it might be time to think about them - if you believe any of it.
I myself have some doubts about both theories. With Peak Oil it seems that the first signs are indeed there, and yes, I imagine it might get a little bumpy, but society is already transitioning - even if very slowly - and migrating towards alternative energy sources and alternative means of production, away from oil. I guess there might be a transitional period where a little chaos could break out (maybe a five-year period), but in general the situation should stabilize, and seeing so many non-western countries still getting along just fine without the vast amounts of oil the West consumes, I guess it might be for the benefit of the Earth if that shake-out mostly affects the overconsuming, overgrowing western world that has everything it desires.
As for Singularity - I just don't know. As much as I like the theories behind it, I also tend to look at history to see into the future for my amusement, and I do see that in every period of intense growth people overestimated technological accomplishments. Not only in war (Romans vs. Germanic tribes, for example) but also in theory (flying saucers, robots and yes, Artificial Intelligence itself - looked at from the standpoint of the 60s, we should have had computers talking to us intelligently for at least 20-25 years now). So yes, computer power is growing and we are advancing in nanotechnology, biology, quantum computing and all the other jazz out there - but are we really capable of developing something that is more intelligent than ourselves, better, faster, more precise, without errors, yet still with the amount of conceptual and creative thinking that is so human and has led us here in the first place? Will raw computing power and artificial neural networks really capture where we want to go?
I do see that we will have some great technological products in the future, but I have serious doubts about singularity and brain uploads/enhancements. I do see robots taking over most non-creative jobs and helping us out in everyday life. I see people living considerably longer than they are supposed to (maybe one day indefinitely, if they choose so), but I do not see a convergent point where this all happens. And yes, cellphones got much smaller, as did iPods, beamers have nanotechnology in them, and so on - but if I look at my everyday computer life, the intelligence inside the machine has not changed a single bit since I got my first one - they are just a little faster; sometimes I think the software I use is even dumber than when it was simpler. My theory is that making something that is better than we are - something that can replicate itself and think on abstract levels beyond us - might be too complicated for the human brain, even as a collective whole, to produce in the first place - I would of course like to be proven wrong. The idea of the singularity saving the world is stunningly beautiful - especially if it could lead to such vast technological advancements as space travel for everyone in my lifetime - I just have the gut feeling that we will hit a roadblock somewhere, physical, mental or theoretical. Out of all the Singularity prospects I would go with the Artificial Intelligence one if I could choose, as I do not trust nanobots or strange biologically upgraded humans - with a plain good old hardware/software combo and a lot of silicon you at least have the option to change a routine after you have started the process.
Of course there is also the possibility that we nuke ourselves out of existence before anything happens.
