The Deep

Author: joe

Saturday, 30 July, 2011 - 21:39

The web has been described as The Shallows, a perspective whose subject position is firm-footed on the far side of the Styx. The web is deep, The Deep. The early pioneers also mourned the loss of the low-lying fens of the embryonic web as a few inches of idiocy washed over man-made territory.

... every year in September, a large number of new university freshmen acquired access to Usenet for the first time, and took some time to acclimatise to the network's standards of conduct and "netiquette". After a month or so, these new users would theoretically learn to comport themselves according to its conventions, or simply tire of using the service. September thus heralded the peak influx of disruptive newcomers to the network [...]
 
Since that time, the dramatic rise in the popularity of the Internet has brought a constant stream of new users. Thus, from the point of view of the pre-1993 Usenet user, the regular "September" influx of new users never ended. The term was first used by Dave Fischer in a January 26, 1994, post to alt.folklore.computers:
 
"It's moot now. September 1993 will go down in net.history as the September that never ended."

Wikipedia, 16 July 2011, Eternal September, [http://en.wikipedia.org/w/index.php?title=Eternal_September&oldid=439704212]

The strata in the body of the web are veined with magnetic powers - the repulsion of opposites, maintained initially by traditional class-like divides: pioneers vs carpetbaggers, early adopters vs noobs, experts vs laymen, serious cats vs your mum.

Here's one of the secrets they don't tell you when you first whip that modem out of its plastic wrapper and fight your way through arcane commands to log on: cyberspace is full of cliques.
 
Wendy Grossman, March 1997, The Making of an Underclass: AOL, [http://www.nyupress.org/netwars/pages/chapter03/ch03_.html]

The early, wide incursions into the rarefied air of the web were not shallow. They were glacial, carving out abysses of information as they terraformed a new geology of media. The new Stygian divide became that between the commercial and the professional (academies, corporations) and the autodidacts (the amateurs that are you and I).

Another interesting point is that the old quality distinction between "authorities" & "experts" on one side and "dedicated individuals" on the other is nowadays slowly disappearing. We could even state -paradoxically and taking account of all due exceptions- that those that study and publish their take on a given matter for money and career purposes (most of those deep web "authoritative experts" and almost all the young sycophants from minor and/or unknown universities that hover around many proprietary databases) will seldom be able to match the knowledge depth (and width) offered by those that work on a given argument out of sheer love and passion.
 
fravia+, 12 February 2008, How to access and exploit the shallow deep web, [http://www.searchlores.org/deepweb_searching.htm]

Foundational, dialectical mythologies arose: the search engines index the useful web; the search engines get co-opted; the PageRank™ algorithm resists gaming; the indexed web becomes mere unwashed popularity. We must maintain the separate layers at all costs.
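
Behind that mythology sits a very small piece of machinery: PageRank is, at heart, a power iteration over the link graph, so "popularity" here means link-weighted popularity. A minimal sketch in Python, a toy rather than Google's implementation, with the damping factor and the four-page web invented purely for illustration:

def pagerank(links, damping=0.85, iterations=50):
    """Toy power iteration over a dict of {page: [pages it links to]}."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if not outs:
                # a dangling page spreads its rank evenly over the whole web
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
            else:
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# four-page toy web: rank is nothing more than link-weighted popularity
print(pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}))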

On the internet, there is no real underground anymore. So if you wanted to create an underground for yourself, the first thing you might do is generate a sort of lexical darknet by using keyterms search engines can’t parse.
 
Warren Ellis, 7 June 2010, †‡† (Cross Doublecross Cross?), [http://www.warrenellis.com/?p=9751]
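
Ellis's post title, †‡†, is presumably exactly the kind of keyterm he means: a string that a keyword index strips rather than stores. A throwaway sketch of the gesture in Python, an illustration of the idea only, with the glyph set and function name made up for this post:

import random

# glyphs of the kind full-text tokenisers tend to strip rather than index
GLYPHS = "†‡◊∴⁂"

def darknet_tag(length=3, seed=None):
    """Return a short tag findable only by exact string, not by keyword."""
    rng = random.Random(seed)
    return "".join(rng.choice(GLYPHS) for _ in range(length))

print(darknet_tag(seed=2010))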

The ongoing conflict between conservatism and progressivism continues inexorably. Emancipatory drives are reified, while solid old steadfasts melt into air: the veins migrate within the rock, metamorphically, changing state, but the strata always remain, somewhere. There are antinomies that will not be mixed.

This is the paradox of the underground: staying small means not being noticed (widely), but will mean being able to exist for probably an extended period of time. Becoming (too) big will mean reaching more people and spreading the texts further into society, however it will also probably mean being noticed as a threat, as a ‘network of text-piracy’. The true strategy is to retain this balance of openly dispersed subversivity.
 
Janneke Adema, 20 September 2009, Scanners, collectors and aggregators. On the ‘underground movement’ of (pirated) theory text sharing, [http://openreflections.wordpress.com/2009/09/20/scanners-collectors-and-aggregators-on-the-%E2%80%98underground-movement%E2%80%99-of-pirated-theory-text-sharing/]

In the metamorphism there are opportunities, Temporary Autonomous Zones which open up in moments of depressurisation, in deterritorialised spaces, but only for as long as the tectonic attention is driven elsewhere - or until the machine is hard-rebooted.

Island2 is a free software artwork by Martin Howse which creates "a semi-permanent, isolated island in the computer's memory". It's a program that firmly establishes its own space in memory, unnoticed and inviolable. This empty, silent virtual zone is only temporarily autonomous, since it can be removed simply by switching off or restarting the machine. But its "hidden territory" is fascinating, an unknown digital land invisibly established under the user's eyes, with no aim to take over any other part of the system.
 
Neural.it, 12 March 2010, Island2, a squatted computer memory zone, [http://www.neural.it/art/2010/03/island2_a_squatted_computer_me.phtml]

The destabilisation always forces immiscible elements back into their ever-moving homogeneous conglomerations, flints in expanses of chalk. The factions shield themselves, striving to make their outer edges crystallise impenetrably, cryptographically. You may trace it, but not interpret.

Freenet is effectively a shadow of the web, with its own sites, forums and email services [...] Since Freenet sites don’t sit on servers, but on data stores spread throughout the network, they can’t be taken down, and because each communication between one computer and another is routed through other nodes, with each one only "knowing" the address of the next node and that of the last, Freenet's users can maintain high levels of anonymity.
 
On Freenet, nobody knows who you are, or what you’re looking at. Each system also contributes hard disk space, which is occupied by a data cache containing chunks of heavily encrypted data that the program can reassemble into Freenet forums and sites [...]
 
Freenet was the brainchild of a young Irish computer scientist, Ian Clarke, who came up with the idea during his studies at the University of Edinburgh in the mid-1990s. He wanted to "build a communication tool that would realise the things that a lot of people thought the internet was – a place where you could communicate without being watched, and where people could be anonymous if they wanted to be".
 
PC Pro, 9 March 2010, The dark side of the web, [http://www.pcpro.co.uk/features/356254/the-dark-side-of-the-web]
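
The mechanics PC Pro describes - chunks of encrypted data addressed by hash, scattered across node datastores, fetched hop by hop with each node knowing only its neighbours - can be caricatured in a few dozen lines. A toy sketch in Python, emphatically not the Freenet protocol: the XOR pad stands in for real encryption, the flooding lookup stands in for Freenet's key-based routing, and every name here is invented for illustration:

import hashlib
import os

CHUNK = 32  # bytes per chunk, tiny so the example stays legible

class Node:
    def __init__(self, name):
        self.name = name
        self.store = {}        # key (hash of ciphertext) -> ciphertext chunk
        self.neighbours = []   # a node knows only its immediate neighbours

    def put(self, key, blob):
        self.store[key] = blob

    def get(self, key, visited=None):
        """Hop-by-hop lookup: check my own store, then ask my neighbours,
        who ask theirs; no node ever sees more than who asked it."""
        visited = visited or set()
        visited.add(self.name)
        if key in self.store:
            return self.store[key]
        for n in self.neighbours:
            if n.name not in visited:
                found = n.get(key, visited)
                if found is not None:
                    return found
        return None

def insert(data, nodes):
    """Split, pad-encrypt and scatter data; return the manifest of
    (key, pad) pairs needed to reassemble it. The nodes hold only
    ciphertext they cannot read."""
    manifest = []
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        pad = os.urandom(len(chunk))                       # stand-in cipher
        cipher = bytes(a ^ b for a, b in zip(chunk, pad))
        key = hashlib.sha256(cipher).hexdigest()
        nodes[(i // CHUNK) % len(nodes)].put(key, cipher)  # scatter round-robin
        manifest.append((key, pad))
    return manifest

def retrieve(manifest, entry_node):
    return b"".join(
        bytes(a ^ b for a, b in zip(entry_node.get(key), pad))
        for key, pad in manifest
    )

a, b, c = Node("a"), Node("b"), Node("c")
a.neighbours, b.neighbours, c.neighbours = [b], [a, c], [b]   # a chain: a-b-c
m = insert(b"a shadow of the web, with its own sites, forums and email", [a, b, c])
print(retrieve(m, a))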

Data form into islands, inhabited but isolated; incommunicable to all but the secret agent, Charon, whose services cost one silver coin.

wget http://1010.co.uk/island2.tar.gz
tar zxvf island2.tar.gz
cd island2
make
insmod ./island.ko

Martin Howse, 5 April 2010, island2, [http://www.1010.co.uk/org/island2.html]
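
Those five commands fetch, build and load Howse's kernel module, which claims its island inside kernel memory. For a feel of the gesture without touching the kernel, here is a rough user-space analogue in Python - not Howse's code - which reserves a region of RAM, pins it with mlock so it is never swapped out, and simply holds it until the process dies or the machine is switched off. It assumes Linux, libc.so.6, and a memlock limit big enough for the hypothetical 4 MiB island:

import ctypes
import mmap
import time

SIZE = 4 * 1024 * 1024   # hypothetical 4 MiB island

# reserve an anonymous region and touch every page so it is resident
island = mmap.mmap(-1, SIZE)
island.write(b"\x00" * SIZE)

# pin the region in physical RAM so it is never swapped out
# (needs CAP_IPC_LOCK or a sufficient RLIMIT_MEMLOCK on Linux)
libc = ctypes.CDLL("libc.so.6", use_errno=True)
addr = ctypes.addressof(ctypes.c_char.from_buffer(island))
if libc.mlock(ctypes.c_void_p(addr), ctypes.c_size_t(SIZE)) != 0:
    raise OSError(ctypes.get_errno(), "mlock failed")

# the island now sits silently in memory until the process exits
# or the machine is switched off or restarted
while True:
    time.sleep(3600)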

Categories: darknet, deep web, shallows, cryptography, cliques, information
Comments: 0