“A system that works for me.”




                          - an anthropological fieldwork
                        in a computer operating system
                                    Andreas Lloyd 2009
My interest in the Ubuntu Linux system began in October 2004 as I was browsing on the Internet looking for
a new laptop computer. I noticed that some shops offered to sell me a computer with no operating system. I
decided that I could save money by installing Linux, and after some trial-and-error, I managed to install
Ubuntu.
I was happy to find it working, but unnerved at the newness of it all: the file system was full of opaque
folder names, changing a setting required typing esoteric commands, and whenever I started up or shut
down the computer, it spouted hundreds of lines of technical messages like those featured above.
If computers have souls, then the operating system is the eye through which the soul of the machine is
reflected. And what and how much it reflects is wholly up to the designers of the operating system. As I got
used to the system, I became more and more intrigued with the transparency of it: All the technical details
were laid bare before me, revealing the immense complexity of the operating system and the effort and
technical knowledge that had gone into its making – all of which I had simply taken for granted until now,
and which now offered itself for interpretation – if I knew how.
All of this showcased technical expertise, hinting at a huge software engineering effort, stood in sharp
contrast to the glossy login screen that followed the boot sequence.

I soon learned that the transparency of the Ubuntu system stemmed from its being open source software.
Open source refers to the source code – which is what the programmer writes. And its being open
signifies that it is available for anybody to edit and improve – if they know how.

But though I had all of the blueprints, all the technical details available to me in the form of these
millions of lines of open source code, I did not have the esoteric engineering knowledge necessary to
appreciate or modify it. So instead, I turned my interest to the people who did, curious to learn more
about how they collaborate to build such an intricate system.
In April 2006 I initiated anthropological fieldwork following the development cycle of a new version of
Ubuntu, contributing and learning among the Ubuntu developers. I spent 6 months participating in the
development of Ubuntu, including a complete Ubuntu development cycle from June to November 2006.

I sought to integrate the on-line and in-person aspects of my fieldwork as closely as possible, and
following my first in-person meeting with the Ubuntu hackers at the Ubuntu Developer Summit in Paris
in late June 2006, I sent out a fieldwork visit request to 22 of the Ubuntu developers I had met there.
Between August and November, I gradually spent more and more time visiting the hackers in their own
working environments, participating in their everyday life and interviewing them, as well as taking part
in the interactions of the whole community on-line.
That’s me!




           Ubuntu Developer Summit, Paris 2006
This is the group photo from the Ubuntu Developer Summit in Paris 2006. It is conferences like these, where
all of the Ubuntu developers came from all over the world to spend a week together, that gave me the sense
that this was a proper anthropological setting. A digital village emerging in a Paris airport hotel for a week of
hacking, discussing and socialising.
Andreas Lloyd:
  “A system that works for me”
 - an anthropological analysis of
  computer hackers’ shared use
     and development of the
      Ubuntu Linux System
             (2007)




A big part of the resulting thesis focused on examining the Ubuntu hackers' day-to-day practices, and the
main argument of the thesis is that the Ubuntu hackers’ shared use and development of the Ubuntu system
constitutes a community of practice around their collaborative work and commitment to the project.

By positing the Ubuntu community as a community of practice, I explore how the Ubuntu hackers are using
new technical and social means to manage and share knowledge and skills on-line, and how these means of
learning and sharing are reflected in the system itself.
1. How has Ubuntu come about?

             2. How is Ubuntu being developed?




                            (anthropologically speaking, that is)
Now, Ubuntu as a community of practice is very interesting. But that is not the part I’ll be
focusing on in this talk. Rather, I’ll be talking about two things:
One: The cultural history of the Ubuntu system and how it has come about.
Two: How Ubuntu is being developed through the shared practices of the Ubuntu developers.
- and mind you, this is an anthropological study. Not an attempt at software history or a computer-science
exploration of open source development.
1. How has Ubuntu come about?




So how did the Ubuntu system come
about?
Ubuntu is Free Software




Well, the first thing to realize about Ubuntu is that it is Free Software. What does that mean exactly? Well, it
is much the same as open source software, though the differences are subject to a prolonged discussion
among the proponents of each term.
Christopher Kelty:
         Two Bits
- the cultural significance
     of Free Software
          (2008)




I use the term Free Software because that is what the Ubuntu hackers call their work. And because it makes
it easier to link to this great book by Christopher Kelty, which explores the cultural history of Free Software.
I have a chapter on the cultural history of Ubuntu in my thesis, and I must admit that Kelty’s work is much
more thorough, detailed and well-argued than mine. So I’m happy to use his work here.
Five core practices of Free Software




In his book, Kelty identifies five core practices of free software. All of these practices are instrumental in the
development and dissemination of the Ubuntu system, so I’ll walk through these in turn, using them as a
way to illustrate the cultural history of Ubuntu.
Sharing source code




Sharing source code is the most basic practice of free software. In order for the other practices to make
sense, this first one must be in place.
Dennis Ritchie & Kenneth Thompson
                          Unix, 1969
Sharing source code is something hackers and software developers have always done. One of the primary
joys of writing good software is being able to share it with peers, win their recognition and even have them
use the software themselves.
When Dennis Ritchie and Ken Thompson wrote the Unix operating system in the late 1960s and early 1970s
(the architecture of which Linux and Ubuntu still use today), they naturally wanted to share the source code
with anybody who might be interested.
By historical coincidence, Ritchie and Thompson developed Unix while working at Bell Labs, a
research department of AT&T. At that time, AT&T held a monopoly on the American telecom industry
and was not allowed to enter competing fields. Thus, the company couldn’t really make money off Unix. And
so, Ritchie and Thompson could spread tapes with copies of the operating system to peers at universities
and other companies all around the world without much expenditure. By the early 1980s Unix was the
preferred system on university and business computers alike.
“     ...they were living bodies of narrative that
  many people knew by heart, and told over
  and over again – making their own personal
  embellishments whenever it struck their fancy.
  The bad embellishments were shouted down,
  the good ones picked up by others, polished
  and improved and, over time, incorporated
  into the story.
                                                - Neal Stephenson
                                        In the beginning was the command line (1999)

This open sharing helped Unix grow and improve, as other developers began to develop new software and
improve the existing software for Unix. As the science fiction writer Neal Stephenson describes it, Unix in the
1970s and early 1980s became something akin to the Gilgamesh epic: An “oral tradition” - a living body of
software narrative that was continually and cumulatively improved with each new version.
Conceptualizing open systems




Another core practice of free software is the conceptualization of open systems. What this means is
conceptualizing and building systems in such a way that everybody is free to use and implement these
systems without the approval of some formal institution.
Vint Cerf, Jon Postel, Stephen Crocker
                        TCP/IP, 1975
The classic example of such an open system is the TCP/IP protocol, which was developed by the Internet
Engineering Task Force - including these three distinguished gentlemen - from the mid-1970s. By publishing all
of the specifications and having open discussions on the nature of the system, they helped ensure its
openness.
And despite the fact that another networking standard had already received approval as an ISO standard (not
an easy thing to do), TCP/IP became the Internet networking standard simply because it was already in use,
and it was easy to adopt and adapt as needed.
“       We reject kings, presidents and voting.
     We believe in rough consensus and running
     code.



                                           - Dave Clark
                                           Internet Engineering Task Force

Dave Clark of the IETF sums up their position quite well: They don’t believe in a standard just because some
standards body approves it. They believe in a standard that is the de facto standard because everybody is
already using it.
Other examples of open standards - most of which haven’t been as spectacularly successful as TCP/IP -
include the Open Document Format (ODF), the Ogg Vorbis audio format (OGG) and the HTTP protocol of
the world wide web.
Writing Licenses




If you share your code or conceive of open systems, how can you ensure that it remains free and open? You
need tailor-fitted software licenses for this.
Richard Stallman
                                       GNU, 1984
In 1984, AT&T had lost its monopoly status and was broken into a number of smaller companies, one of
which could now turn Unix into a cash cow with exorbitant license fees.
Richard Stallman, a hacker at MIT, found it morally intolerable for an owner of software to say “'I won't let
you understand how this works; I'm going to keep you helplessly dependent on me and if you share with
your friends, I'll call you a pirate and put you in jail.'” So in 1984, he quit his job at MIT in order to devote
himself to what he called free software – software that anyone could freely study, use, modify and
redistribute in any manner they please, as long as they also freely redistribute their own changes to the
source code.
He decided that the best way to advance this agenda would be to write a completely free version of Unix,
which he, in a typical case of hacker wit, called GNU – a recursive acronym for “GNU's Not Unix.” And
building on the same “oral tradition” described by Stephenson, interested hackers soon began to send him
improvements and suggestions, which he could incorporate in new releases of the GNU programs.
“     The licenses for most software and
   other practical works are designed to take
   away your freedom to share and change the
   works. By contrast, the GNU General Public
   License is intended to guarantee your
   freedom to share and change all versions of
   a program--to make sure it remains free
   software for all its users.

                                       - Richard Stallman
                                            Preamble, GNU General Public License

As the project grew, Stallman created the GNU General Public License (GPL) to legally ensure that the GNU
programs remained free software. The GPL ensured that users could not restrict the rights granted by the
GPL to any other user, meaning that any GPL software, even if improved or combined with non-GPL software,
could only be redistributed on GPL terms.
The American anthropologist Gabriella Coleman points out that with the GPL Stallman cleverly hacked
traditional copyright law by inverting it: It was not a question of enforcing an authorial monopoly, but to
ensure that his software was freely available and impossible to monopolize, in effect coercing all users of his
software to give away their own improvements.
Coordinating Collaboration




Though Stallman distributed his GNU software on the Internet, he did not use it to open his development
process up to greater collaboration, which is probably the core practice for which Free Software is best known today.
Linus Torvalds
                                      Linux, 1991
In 1991, Linus Torvalds, a young Finnish Computer Science student frustrated at waiting in line at the
university Unix terminals, wanted to run Unix on his new PC at home. All he needed to complete the GNU
system he had downloaded off the internet was a kernel – the core program which allocates resources to the
various applications and users present on the system. He began writing such a kernel over the summer
holidays that year. Since Finland had invested heavily in telecommunications infrastructure, he was one of
the first to have Internet access from home, and on October 5th, he announced his kernel project on the
Internet:
“      This is a program for hackers by a
   hacker. I've enjoyed doing it, and somebody
   might enjoy looking at it and even modifying
   it for their own needs. It is still small enough
   to understand, use and modify, and I'm
   looking forward to any comments you might
   have.

                                      - Linus Torvalds
                                           Announcing Linux on Usenet, 1991

What was remarkable about Linux was that Torvalds embraced the fact that the Internet allows people from
all over the world to collaborate instantly whereas earlier, such collaboration had been limited to the slow
and random “oral history” exchange of tapes and disks. He used the Internet to constantly release new,
rough versions of the code for others to test simply by making it available for download on-line,
incorporating patches – code improvements – and suggestions which other hackers sent to him, discussing
designs and plans on mailing lists and giving them the opportunity to add their own changes as they proved
themselves technically capable.
Fomenting Movements




The final core practice of free software is fomenting movements. Again, this was to a great extent
developed by Richard Stallman as he fought for software freedom in the early 1980s, but it really came to
the fore with the rise of Linux distributions. Throughout the 1990s, it was still a formidable task to get Linux
installed and functional on one's computer, as it involved configuring and maintaining everything yourself. To
make it easier to install and run Linux, distributions of Linux began to appear, integrating much of the new
software with GNU/Linux for easier installation and maintenance.
Ian Murdock
                                       Debian, 1993
Initiated in 1993 by Ian Murdock, Debian is one of the great Linux distributions, seeking to keep a complete
catalog of all the free software available online.
“       We will make the best system we can,
    so that free works will be widely distributed
    and used.




                                  - Debian Social Contract

But unlike many of the other Linux distributions, which were owned and developed by companies, Debian
was a community-driven effort with hundreds of developers gathering under the promise of the Debian
Social Contract to “make the best system we can, so that free works will be widely distributed and used.”

Debian is not merely a system or a distribution of software. It is a movement dedicated to distribution and
dissemination of free software.
This is a genealogy of sorts of the various Linux distributions in existence by 2007.
As one of the old Linux distributions, Debian has spawned a wide number of other Linux distributions, which
base their efforts off Debian’s codebase.
Ubuntu is one such heir, branching out of the main line of Debian to produce a system with a somewhat
different set of goals from the main Debian system.
Mark Shuttleworth
                                 Ubuntu, 2004
In 2004, former Debian developer - and recent dot-com IT millionaire - Mark Shuttleworth hired 12
prominent free software developers, among them many of the most active Debian developers, to work on a
new Linux distribution.
“     Microsoft has a majority market share in
   the new desktop PC marketplace. This is a
   bug, which Ubuntu is designed to fix.




                                         - Ubuntu bug no 1
                                               Filed by Mark Shuttleworth

Thankful for all the free software tools that had made his success possible, Mark Shuttleworth wanted to
give back to the free software communities. He had seen how proprietary software like Microsoft
Windows forced countries like his native South Africa to spend millions on software licenses, and he was
disappointed at the unwillingness of the commercial Linux distributions to challenge Microsoft for more than a
minimal market share. So Shuttleworth made it his lofty goal to break Microsoft's majority market share of PC
operating systems, as he later stated in the first bug report to be entered into the Ubuntu bug tracking
system.
“      Non-free software is holding back
    innovation in the IT industry, restricting
    access to IT to a small part of the world's
    population, and limiting the ability of software
    developers to reach their full potential.


                                         - Ubuntu bug no 1
                                               Filed by Mark Shuttleworth

By casting Microsoft's monopoly on desktop PC operating systems as a bug to be fixed, Shuttleworth defines
proprietary software as an active impediment to software innovation. By positioning Ubuntu as the solution to
this bug, Shuttleworth puts the core practices of Free Software at the centre of software innovation,
reflecting this in the name Ubuntu – an ancient Zulu word meaning “humanity towards others” whose
associated ideology of shared humanity provides a humanistic basis for the free software movement that he
is championing.
Sharing source code

               Conceptualizing open systems

               Writing licenses

               Coordinating collaboration

               Fomenting movements


So, to sum up: The Ubuntu system comes from these five core practices, and it is continuing the
development of these practices into the 21st century.
2. How is Ubuntu being developed?




So let’s look at just how they’re doing that. But in order to do so, perhaps it’s necessary to get an idea of just how
big the Ubuntu system is.
The Ubuntu system is an abstract thing, consisting of millions of lines of source code. But computer
source code has the unique quality of both being abstract information and being able to have a concrete
effect in the world. Software developers are often called software engineers, and for good reason. A good
way to think of the Ubuntu system is as a huge engineering effort of accumulated complexity managed and
continually developed by its users.
Empire State
                                                                            Building

                                                                             Ca. 3500
                                                                             man years




Let’s look at some other engineering works of some size and complexity... like the Empire State Building.
The Panama Canal

                                                                          Ca. 10.000
                                                                          man years




Let’s look at some other engineering works of some size and complexity... like the Panama Canal.
Debian in 2005

                                                                    230.000.000
                                                                    lines of code

                                                                       estimated
                                                                        60.000
                                                                       man years



In 2005, a group of Spanish computer scientists counted the amount of code in the whole Debian system. They
arrived at a total of around 230.000.000 lines of code. Using an industry standard for estimating
programming labor, they estimated that the Debian system at that point was the result of 60.000 man years.
All of this work had obviously taken place in the previous 20 years or so.
Some might object that this is comparing apples and oranges, and it would be more interesting to compare
the Debian system to an intellectual work such as an encyclopedia or a library. I’d agree, but for some
reason, buildings and canals are just a lot more impressive: It’s more than six times the effort of digging
and constructing a canal between two of the biggest oceans in the world. That’s BIG.
“      We build our computers the way we
   build our cities -- over time, without a plan,
   on top of ruins.
                                                 - Ellen Ullman
                                                 Close to the machine (1997)
In fact, when you think about it, the only other works of human engineering of comparable size are cities.
And as the programmer-turned-writer Ellen Ullman puts it, we build our computers - and our computer
systems - in much the same way as we build our cities.
Ubuntu consists of more than
   20.000 software packages.




And indeed, Ubuntu, inheriting the modular design of Debian, is constructed much like a city: Consisting of
more than 20.000 software packages, some big, some small, and all of them part of an intricate web of
complex dependencies. Much like a carefully assembled model: Some parts need to be in place for others
to work, and some parts depend on functions supplied by packages on which they aren’t even directly
dependent. Much like a city and its interconnected web of services and infrastructure.
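This web of dependencies can be sketched in a few lines of Python. The package names and dependency relations below are purely illustrative, not Ubuntu's actual dependency data:

```python
# Illustrative sketch of a package dependency web, in the spirit of the
# one described above. Package names and relations are made up.
DEPENDS = {
    "firefox": ["gtk", "libc"],
    "gtk": ["glib", "libc"],
    "glib": ["libc"],
    "libc": [],
}

def transitive_deps(pkg, seen=None):
    """Collect every package that `pkg` depends on, directly or indirectly."""
    if seen is None:
        seen = set()
    for dep in DEPENDS[pkg]:
        if dep not in seen:
            seen.add(dep)
            transitive_deps(dep, seen)
    return seen

print(sorted(transitive_deps("firefox")))  # ['glib', 'gtk', 'libc']
```

Note how "firefox" ends up depending on "glib" even though it never names it directly – the kind of indirect dependency described above.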
Upstream
        Free software projects
                 ↓
          Debian unstable
                 ↓
           Ubuntu release
                 ↓
           User’s system
Downstream
Each software package in Ubuntu is a piece of software developed outside of Ubuntu by other developers in
separate free software projects. Since the developers involved with these projects are rarely directly involved
with Ubuntu, they are often called the upstream for distributions like Debian and Ubuntu. Since Ubuntu
depends on the Debian development version known as Debian Unstable to gather the latest versions of all
of the free software in the whole free software eco-system, Debian is also the upstream for Ubuntu, which
in turn is the upstream for the actual users.

In this way, a constant flow of new software packages carries the latest changes from the various free
software projects into Debian and from there into Ubuntu. Most of the work of the Ubuntu developers
thus consists of packaging and integrating this new software with the existing Ubuntu system.
What’s in a software package? Well, an easy example would be an application like the web browser Firefox.
Or the free software Photoshop alternative known as
GIMP.
Or it could be something more fundamental, such as the programming language Python, on which other software
packages written in Python depend.
“     ... organizations which design systems...
 are constrained to produce designs which are
 copies of the communication structures of
 these organizations.



                                               - Melvin Conway
                                               How Do Committees invent? (1968)

The modular design of the Ubuntu system is inherited from Debian, where each software package has its
own maintainer, who has complete control over his or her package. Though Ubuntu does not have
designated package maintainers in this way, the communication lines are much the same: A distributed
system where hundreds of developers can work on separate packages without having to change the rest of
the system (well, most of the time, at least).
This confirms Conway’s Law that organisations designing software systems produce designs that copy the
communication structures of that organisation.
Let’s look into how that works.
Whenever you install Ubuntu on your computer, the first thing the system does is check whether there
have been further changes to the system by consulting the online software package repository. It then
proceeds to download the corresponding changes and update your system.
Whenever the Ubuntu developers fix a bug or implement a new feature in a software package, they upload
a new version of that package containing those changes to the main Ubuntu archive on-line, against which
all on-line Ubuntu systems running the same version of Ubuntu are synchronized through Internet updates.
This not only makes it very easy to distribute fixes as they are produced, but also easy to introduce unexpected
problems if these updates aren't thoroughly tested.
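The synchronization step described above amounts to comparing the installed package versions against the versions published in the archive. Here is a minimal sketch in Python, with hypothetical package names and version numbers (this is not the actual implementation of Ubuntu's package tools):

```python
# Hypothetical sketch of an update check: compare locally installed
# package versions against the online archive and list pending upgrades.
installed = {"firefox": "1.5.0", "python": "2.4.3", "gimp": "2.2.8"}
archive   = {"firefox": "1.5.1", "python": "2.4.3", "gimp": "2.2.10"}

def version_key(version):
    """Turn '2.2.10' into (2, 2, 10) so versions compare numerically."""
    return tuple(int(part) for part in version.split("."))

def pending_upgrades(installed, archive):
    """Return the packages for which the archive has a newer version."""
    return sorted(
        pkg for pkg, version in installed.items()
        if pkg in archive and version_key(archive[pkg]) > version_key(version)
    )

print(pending_upgrades(installed, archive))  # ['firefox', 'gimp']
```

Every on-line Ubuntu system running the same version performs essentially this comparison against the same archive, which is what keeps them all synchronized.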
Stable version 1                      Stable version 2               Development version




In order to be able to extend the Ubuntu system without risking breaking it for their users through
an ill-conceived software update, the Ubuntu developers work on at least three active versions of Ubuntu at
any given time: Two stable versions, which the Ubuntu developers update only as new security holes
and integral weaknesses are exposed. And one current development version, which only developers use.
The Ubuntu developers themselves almost always use the development version of the Ubuntu system for all
of their work. The development version contains the latest unstable software derived from Debian and the
various upstream free software developers, of whose software both Debian and Ubuntu are composed.
The main development work of the Ubuntu developers is to gradually stabilize more and more of the system,
testing and ensuring its quality in time for release.
Indeed, one of the main differences between the Ubuntu system and Debian is the pace of software
releases. Where it typically takes Debian 18-23 months to release a new stable version, Ubuntu releases a
new version every 6 months. As mentioned before, both Debian and Ubuntu releases are based on the
permanent development version of Debian called Debian Unstable, which is a snapshot of the latest versions
of all the free software in the free software ecosystem.
“       The first thing to realize is that the
      outside world ran on releases of UNIX (V4,
      V5, V6, V7) but we did not.

      Our view was a continuum.


                                                       - Ken Thompson
In this way, Debian Unstable is remarkably similar to the original Unix system at AT&T. Their system was
always in development. Releases were only something they did when they had to pass the software on to
people outside AT&T.
“       V5 was simply what we had at some
      point in time and was probably put out of
      date simply by the activity required to put
      it in shape to export.



                                                         - Ken Thompson
And as the development of Ubuntu and Debian shows, the effort it takes to release a stable version of the
system is so big that it is out of date by the time it has been released, as new software continues to trickle
down from the free software projects upstream.
Thus, to the Ubuntu developers, the Ubuntu system looks a bit like the Beijing skyline: A place of constant
development. Whenever a building is finished, some other part of town is ready to be renovated or even
replaced completely.
This perpetual state of development can be illustrated through an examination of the Ubuntu release cycle,
which is ... well, cyclical and somewhat circular.
Each new version begins life with the merging of the old version with all of the new software from Debian
Unstable, along with the new features specifically designed and developed for that specific version of
Ubuntu. All of this very unstable software is then slowly frozen (closed to new feature changes),
improved, adapted and stabilised, tested, and further stabilised until the whole system is ready for release.
And shortly thereafter, the whole cycle begins anew.
“     ... users know better than operating
     system designers what their own needs are.




                                                        - Eric Raymond
So how is Ubuntu developed? Well, even though Ubuntu aims at making Linux and free software more
accessible for non-technical users, much of the development effort still follows the notion from free software
evangelist Eric Raymond that “... users know better than operating system designers what their own needs
are.”
He said it of Unix, but much the same is true of Ubuntu. The Ubuntu system is primarily developed to match
the needs of the users who are capable of developing it: The Ubuntu developers themselves. Eric Raymond
also said that every good piece of software starts with a developer scratching his own, personal itch. These
itches are indeed contracted through the developers’ own use of the system.
“     ... users know better than operating
     system designers what their own needs are.




                                                        - Eric Raymond
That said, the Ubuntu developers do attempt to work with interaction designers and usability experts to
improve the system’s user interfaces in general. But such design and planning is not core to the way they
develop the system, as we shall see.
They also make a keen effort to open up development work and make it as easy and accessible as possible
for newcomers to get involved scratching itches of their own. This effort is also very fascinating, but it
relates directly to my analysis of Ubuntu as a community of practice, which is another, long story of its own.
So, how do the Ubuntu hackers use the Ubuntu system?
Their work is a balanced duality between focusing intensely on their programming work and constantly
scanning the periphery for new information to relate to. This duality is summed up quite nicely by the way
Ubuntu hacker Martin has organized his workspace.
He has set up two IRC clients monitoring the channels he is most interested in, and on top of them, he has
placed his main tool, the command line terminal, in such a way that the last 10 lines of conversation from
the IRC clients remain visible, allowing him effortless participation in the constant flow of community
chatter as part of his work on the Ubuntu system.
Martin uses the command line for most of his work. It gives him a clear and transparent means of
interacting with the computer through the concise and extensible commands he inputs. It also allows him to
spread his work beyond his own machines, as he can use it to log on to computers related to other free
software projects around the world, or to his own server halfway across the country, with the abstract and
transparent textuality of the command line offering him the same level of control over those remote systems as
if he were there in person.
When programming and working intensely on the computer, a hacker often enters a state of uninterrupted
concentration in which he loses himself from the outside world, focusing solely on the computer to the point
where the interaction with the computer happens effortlessly, allowing him to immerse himself in the code to
the point where "You don't need to think about the variable names, they will just appear in your mind when
you need them," as one Ubuntu hacker puts it.
This state is very similar to what the Hungarian-American psychologist Mihaly Csikszentmihalyi calls flow: “the state in
which people are so involved in an activity that nothing else seems to matter.” It is worth noting how such
diverse activities as programming, playing computer games or writing essays all draw on the computer's
continuous feedback, its demand for interaction and its minimal requirement of physical exertion – a
combination with a huge potential to induce and maintain the flow state for hours on end.
The flow state requires complete concentration and control, pooling all your attention in a tunnel-like vision
focused on the screen, making you willfully unaware of your surroundings, including the physical actions you
perform to interact with the computer, empowering and guiding you only towards the work on the screen.
“       ... the re-arrangement of tiles on the
   tray is not part of action, it is part of thought

                                                  - Clark & Chalmers
In this way, the interaction with the computer is similar to the philosophers Andy Clark and David Chalmers'
notion of “active externalism”, through which they argue that certain mental tasks depend on physical
objects to be solved efficiently. They give the example of the letter tiles in a Scrabble game: we depend on
physically moving the tiles in order to win new perspectives on the possible words to be spelled, rather than
on some purely internal process. Thus, “the re-arrangement of tiles on the tray is not part
of action; it is part of thought.” (Clark & Chalmers 1998:7; emphasis original). In this way, the computer and
the system through which it works become an extension of the hacker's mind: it is only through constant,
uninterrupted interaction with and focus on the computer that he can solve the task at hand.
Although this mode of working seems highly individual, there is a constant flurry of interaction taking place
through the computer. Each developer attends IRC channels and subscribes to mailing lists reflecting their
different interests around the community. Most of these discussions are essentially long question and answer
sessions, as hackers turn to fellow hackers as well as manuals, web searches and documentation to find the
answers necessary to retain their mastery of the computer.
This intolerance of interruption makes hackers somewhat asynchronous to one another – at least in the short
term. This is reflected clearly in the fact that all of the Ubuntu hackers' preferred on-line communicative
means are textual and thus – at least to some extent – asynchronous. Email, newsgroup and web forum
postings and bug tracker comments are all based on their users reading and replying asynchronously. Even
real-time communications such as IRC chat channels and Instant Messaging bend to this rule, as developers
“ping” each other, and if there is no immediate response, they can ask their question and let the other party
answer when he has time or attention to spare.
...
        09:00    carlos pitti: ping
        [...]
        09:07    pitti      carlos: pong
        09:08    carlos pitti: I did a mistake yesterday night and latest Edgy export
        has the plural form bug (bug #2322)
        09:08    Ubugtu Malone bug 2322 in rosetta "Truncated plural
        forms" [Critical,In progress] http://launchpad.net/bugs/2322
        09:09    carlos pitti: I'm exporting a new version with that fixed, but it would
        take around 2-3 hours
        09:09    carlos am I late to have it in the prerelease version?
        09:09    pitti      carlos: ah, then I'll rebuild the edgy packs this afternoon
        09:09    pitti      carlos: it won't go into RC anyway
        09:09    carlos ok
        09:09    pitti      carlos: the plan is to upload the final packs tomorrow morning
        09:10    pitti      carlos: thus I'd like to have today's in perfect shape
        09:10    carlos I see
        09:10    carlos ok
        09:10    carlos pitti: I will ping you when the new version is available




Here, Carlos needs to notify Pitti of a new bug which he needs to take into consideration when building a
group of packages for upload. Since Pitti is busy, the conversation doesn't continue until Pitti is able to
respond and they can coordinate their work. Most of the Ubuntu hackers' day-to-day interaction takes place
on IRC where they can pick up on interesting discussions and be available if someone needs to ask a
question.
The hackers deftly navigate back and forth between conversations, fluidly participating as the IRC client
automatically notifies them when someone “pings” them or even just mentions their on-line moniker. And
even if they miss something, they can always go back to check the chat logs or mail archives, as all
interactions within the community are recorded and publicly archived on-line.
“       ... people are not always in the
      public, but they are always ready to be.



                                            - Christopher Kelty
                                                     Geeks, Recursive Publics
                                                     and Social Imaginaries (2005)

As media theorists Geert Lovink and Ned Rossiter have noted, on-line life is defined by passive consumption
of information, as it would be impossible to relate and reply to all the information that passes by. This can
literally be seen in the mailing lists and the IRC channels to which the individual Ubuntu hackers subscribe.
Most of these interest them, but not deeply enough to participate actively and regularly – unless a topic
comes up which they care deeply about. Instead, they spend most of their time and energy on relatively few
mailing lists and IRC channels, where they participate much more actively, as they are well acquainted with
the people there, and where they contribute most of the discussion, joking and socialising to which
others relate passively. As Chris Kelty remarks, “people are not always in the public, but they are always
ready to be.”
“I got a system that works for me.”




This fine-tuning of the system to the needs of the individual hacker was reflected in the Ubuntu hackers’
answers to the question “What do you get out of developing Ubuntu?” Almost all of them answered: “I get a
system that works for me.”
A system that not only supports their individual work, but also their shared exchanges online. Thus, the
hacking practices of Ubuntu hackers are not the work of lone engineers, but rather a complex interplay
between hacking work and social interaction on-line (as well as in-person).
Howard Rheingold:
        Tools for thought
             (1985)




This use of the computer is not all that unexpected. As Howard Rheingold's 1985 history of the computer,
aptly titled “Tools for Thought,” explains, a number of the people involved in developing the computer
as we know it today wanted to develop a tool for thought - a machine that could support and help people
deal with the increasing complexity and speed of the modern world.
“      By ‘augmenting human intellect’ we
      mean increasing the capability of a man to
      approach a complex problem situation, to
      gain comprehension to suit his particular
      needs, and to derive solutions to problems.


                                            - Douglas Engelbart
                                                Augmenting Human Intellect (1962)


One of the leading visionaries in developing the computer as a tool for human thought was Douglas
Engelbart, whose project to “augment human intellect” centred on the computer. He was one of the first to
realize that the computer allowed the individual user not only to handle complexity on his own, but also to
collaborate with other users, thus further augmenting their collective intellect.
Engelbart was a pioneer of online collaboration, creating the first proper interactive collaborative setup in
1968. In many ways, the Ubuntu hackers are among his intellectual heirs, as they have internalized his work
to a great extent in their collaborative practices online.
“I got a system that works for me.”




But back to the individual hackers' use of the system: How does their own individual use and adaptation of
the system to their personal needs affect the system as a whole?
“       ...It's like moving from renting a house
    to buying a house – you begin to look at it in
    a different way.

    You begin to notice all the places where there
    is room for improvement.


                                                      - Colin Watson
Well, Ubuntu hacker Colin Watson offered this telling follow-up explanation of what he gets out of
working on the Ubuntu system.
“       Part of it is commitment... having and
  taking the responsibility to make things work,
  but it is also having the opportunity to do so.
  You decide for yourself.



                                                         - Colin Watson
Here, Colin sums up what most of the Ubuntu hackers told me in some form or another: working on Ubuntu
is about deciding to make a commitment and taking responsibility for developing the system, much like any
house owner decides to spend time on maintaining his house. The central point here is that it is not
something that is forced upon them. Spending time maintaining and building the system rather than simply
using it is a hobby, an interest that offers not only new insights but also new social relationships with other
builders.
Stewart Brand: How Buildings Learn (1994)
Stewart Brand’s book “How Buildings Learn” examines how buildings evolve after being constructed and
after people start living in them. A key concept in that book is that of “shearing layers”. Brand explains how
different parts - layers - of a building evolve at different paces. These are - from slowest to most rapid
change:
1) The site where the house is built - nothing short of major climate change or earthquake will change it.
2) The structure of the house - once set in place, the basic frame of the house is difficult and expensive to
change.
3) The skin of the house - the facade and roof of the house, which is its face to the outside world.
4) The internal services of the house - the water pipes, electricity, gas, broadband and TV cabling and such.
5) The space plan of the house - how the walls and doors and windows are set up.
6) The actual stuff of the inhabitants of the house, the furniture, the books, the cupboards and closets full
of clothes and kitchenware.

Each of these shearing layers has its own flow and changes at its own pace. A house is in constant, albeit
very slow development, adapted to the needs of its inhabitants. I would argue that a computer system like
Ubuntu works in much the same way.
Jonathan Zittrain: The future of the Internet (2008)
I’d argue that a computer system like Ubuntu consists of shearing layers similar to those of a house. This
illustration is from Jonathan Zittrain’s book “The future of the Internet (and how to stop it)”, and shows the
layers of technology which we take for granted in using a computer (right) and the internet (left).
At the bottom is the hardware - the copper and fiber of the wires, the chips of the computers - in the middle
are the open systems - the internet protocol and the operating systems - and at the top are the actual
applications that we use - our everyday stuff.
“        Generativity is a system's capacity
       to produce unanticipated change through
       unfiltered contributions from broad and
       varied audiences.



                                             - Jonathan Zittrain
                                                   The future of the Internet (2008)

The hardware platform and the open systems built on top of it are not controlled by a single entity
like a government or a company, but are rather distributed among everybody on the internet with a PC of
their own.
Zittrain’s core point is that this lack of centralized control makes these systems generative. Generative in the
sense that they allow for anybody - large and varied audiences - to adapt them to their own needs, creating
unanticipated change through unfiltered contributions.
Since the platform - both the PC and the Internet - is open and generative, there is no single entity to
prevent people from building whatever they please on top of it. And that is exactly what characterizes free
software like Ubuntu.
“      You're doing your task, or a task you've
      done before, and at some point you wonder
      ‘why do I always need to type five different
      commands to achieve this task which I'm
      doing over and over again?’



                                                       - Martin Pitt
It is the generativity of the Ubuntu system which gives the Ubuntu developers the opportunity to adapt it to
their own needs to the extent that they seek and enjoy.
Martin gives a good example of how he uses this generativity in his daily work: Why does he have to type
the same thing five times every time?
“       And so, some times when you have
     nothing urgent to do, you just think about the
     details and say ‘let's factorize this into a
     function’ or ‘let's factorize this into a shell
     script.’


                                                      - Martin Pitt
And so he adapts the system in a way that closed, sterile platforms - like Facebook or the
iPhone - cannot allow: he writes a script that turns those five commands
into just one.
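The kind of factorization Martin describes can be sketched as a shell function. The five steps below are invented placeholders (echo stand-ins), not his actual commands; the pattern of wrapping a repeated sequence under a single name is the point:

```shell
# Hypothetical sketch of "factorizing" five repeated commands into one.
# Each step is an invented stand-in for a real tool a developer might run.
update_package() {
    pkg="$1"
    echo "fetching source for $pkg"       # stand-in for fetching the source
    echo "applying local patches to $pkg" # stand-in for patching
    echo "building $pkg"                  # stand-in for the build step
    echo "running tests for $pkg"         # stand-in for the test suite
    echo "uploading $pkg"                 # stand-in for the upload
}

# Five steps, one command:
update_package postgresql
```

Once defined in his shell configuration, the whole routine is a single word at the prompt - a desire path cut into the system.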
It’s like the “desire paths” you see in parks or on university campuses, which are the short cut routes people
actually walk rather than the ones designed by the planners of the park. Taking the opportunity to make
these short cuts yourself based on your own needs and interests rather than being dependent on the ones
made by design.
For Ubuntu developers, the greater the extent to which you can match your system to the desire paths of
your mind, the better the system will augment your intellect and help you in your work.
“       The command line is basically like an
   organism that over time you can dynamically
   adapt it to your needs and to your current
   tasks...



                                                   - Martin Pitt
Martin has been building his own desire paths in his system through more than eight years of constant use,
adding scripts and short-cuts and fitting this tool for thought to his thoughts. This takes place especially at
the command line, the most flexible and transparent representation of the system. It allows Martin and
other Ubuntu hackers to build a shared repertoire of tools and routines spanning almost all of their mutual
engagement in the project – programming, managing their emails and communications, emulating and
testing other systems, and accessing other computers on the Internet – thereby constituting an entire
digital landscape which they can adapt to their needs.
“     ... This is what makes it so much fun,
   because you can never do this with GUI
   [Graphical User Interface] applications. I
   mean, it might be easy to learn them, but it
   always take the same time to do, so it is
   hard to factorize common tasks to them.


                                                  - Martin Pitt
This also explains Martin's and other Ubuntu developers' hesitancy to use graphical user interfaces: though
these make the system easier to learn and use, they are much harder to adapt and personalise to your
individual needs.
“      To build is in itself already to dwell ...
    Only if we are capable of dwelling, only then
    can we build.




                                                  - Martin Heidegger
The German philosopher Heidegger argued that we are always dwelling, whenever we use and inhabit our
environment, and it is only by dwelling that our environment can be understood and thus organised in the
mind. Building on this, the British anthropologist Tim Ingold argues that building cannot be understood as a
simple process of realising a pre-existing architectural design through raw materials – it is only through
dwelling, relating to the world one inhabits, that building is possible.
Thus, Ingold argues, as children grow up in environments furnished by the work of previous generations,
they dwell in a built environment that is shaped through human intentionality, and through that environment
they adopt “specific skills, sensibilities and dispositions” with which they in turn continue to build the
environment (Ibid:185-186).
I contend that a similar process of dwelling is at play in the Ubuntu system. Computer systems are like
cities: layers upon layers of human intentionality that evolve as new inhabitants move in, constantly re-
exploring and redefining their structures and applications, adapting and re-designing them for new purposes,
maintaining them differently as needs change. And the Ubuntu system, and especially the command line at
its heart, stems from the almost 40-year-old “oral tradition” of Unix technology being refined and redesigned
for varying needs.
Thus the system, like any other built environment, has been shaped by the activities of several generations
of predecessors whose work is now reflected in the skills, dispositions, and dwelling of the Ubuntu
developers. When they adopt and learn the details of the system, customising it to fit their needs, they
begin to dwell within the system, internalising their work flow and social relations to their shared community
directly in the system itself, turning it into an extension of their mind.
Umwelt




      Jacob von Uexküll: A Stroll Through the Worlds of Animals and Men (1957)
In his book “A Stroll Through the Worlds of Animals and Men”, the Estonian biologist Jakob von Uexküll
describes the manifold inhabitants of an oak tree: the fox building its lair at the roots, the owl perched high
on the branches, the squirrel foraging nuts in the intricate maze of the treetop, the various insects making
their home in the bark, and many others. Each of these animals perceives and acts differently in the
environment of the oak tree, and von Uexküll defines the sum of each animal's actions and perceptions of
the tree as its Umwelt. Von Uexküll argues that no animal is able to detach its consciousness from its
Umwelt and see the tree as a neutral object – each will invariably attach its own activities to it.
The British anthropologist Tim Ingold argues that a house is inhabited in much the same way as von
Uexküll's oak tree – that houses, too, are living organisms, constantly being built and rebuilt through the
dwelling of its many inhabitants – human as well as non-human – shaping their own Umwelten through the
flow of intentional activity.
I suggest that, like the house or the oak tree, the Ubuntu system is an immensely large and complex system
with the potential to cover a constantly increasing number of configurations, uses and needs. Each user of
Ubuntu values different functions of the system and has different motivations for using it, and through this
use each user comes to adapt the overall, standard system to his own personal work flow and interests,
shaping it into a personal Umwelt, distinct and separate in use and configuration from that of other users,
drawing upon the thousands of software packages to build his own nest in which to dwell.
“          It doesn't matter how acquainted you are with a
      particular brand of computer. When you need to borrow
      someone else's, it feels like you're using their toothbrush. The
      instant you turn a friend's computer on, it's there: that strange
      arrangement of familiar parts (why do they do it like that?),
      the whole disorienting logic of a place you thought you knew.
      You kind of recognize it. There's an order here. Then, a
      moment of terror. You are...peering into someone else's mind!



                                                           - Kevin Kelly
                                                                 Out of Control (1994)


This intimate relationship between user and computer gives rise to the anxiety of using somebody else's
computer described by Kevin Kelly. The more intertwined and personalized the system is to a specific user,
the more alien and strange it will seem to everybody else. Using somebody else's system is indeed like
peering into someone else's mind.
These Umwelten can be as physically and mentally separate from one another as that of a squirrel is from
that of an owl: from high-powered servers offering critical services to hundreds of users, to cheap desktop
PCs used for games and socialising. Each user adapts the system to his needs, some even contributing back
based on that use, in order to scratch their own itches.
Or like Og, who submits Brazilian Portuguese translations of the text in Ubuntu packages in order to improve
Brazilian Portuguese language support in Ubuntu.
Or like Sebastian who, wanting a better graphical user interface for his favourite package manager, works to
improve the user interface of Ubuntu’s package management software.
Or like Henrik who, paralyzed from the neck down and thus requiring a specialized “head mouse” and dialing
wand to navigate the system – something that shapes not only his work flow but also his configuration of
the system – works to improve the accessibility features of Ubuntu to make the system easy to use for
disabled users.
Or like Scott, who, unhappy with the boot and reboot times of Ubuntu, works to replace the old Unix init
system that boots up the system with his own free software project, Upstart, which integrates better with
the rest of the system.
Or like Jonathan, who, preferring the KDE desktop environment to the GNOME desktop environment that
Ubuntu ships as default, works on Kubuntu - a sister distribution to Ubuntu sharing almost all of the basic
functionality, but with a different graphical interface and different desktop applications on top. It is all a
matter of preference and interest.
Thus, in order to grow, the system depends on all of the hackers and users individually dwelling in, sharing
and collaboratively building the system into a constantly evolving organism – like the different species
inhabiting and supporting the ecosystem of an oak tree – with each new version of the system building on
top of the previous to reach further, supporting more uses and covering more needs.
Whenever an Ubuntu hacker uploads a change to the system, he notes the change in the system changelog
– a central listing in which the developers register all of the changes to the system – and signs it with his
digital signature to associate that change with his on-line identity. Thus, at this system-wide level, the
Ubuntu hackers communicate their work by changing the system directly, letting their actions and changes
to the system speak for themselves.
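For illustration, a Debian-style changelog entry of the kind Ubuntu packages carry might look like the fragment below. The package name, version, bug number, maintainer and date are all invented for the example; the trailing name-and-email line is what ties the change to the developer's identity, and the upload as a whole is signed with his GPG key:

```
hello (2.10-1ubuntu1) jaunty; urgency=low

  * Fix crash when started without arguments (LP: #123456)

 -- Jane Hacker <jane.hacker@example.com>  Mon, 02 Mar 2009 10:00:00 +0100
```

Because every entry is public and attributed, the changelog doubles as a record of who did what - communication through the act of building itself.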
Stigmergic collaboration




The Australian new media theorist Mark Elliott calls this kind of collaboration and communication directly
through action “stigmergic collaboration”, borrowing a term coined by the French zoologist Pierre-Paul
Grassé in the 1950s to describe the way termites coordinate by building and maintaining their nest.
Elliott argues that human stigmergic collaboration has become possible through digital means, as these
make it possible to sidestep social negotiation, allowing hundreds of developers to contribute and
collaborate without having to become acquainted with and maintain relationships with one another.
But since changing the system carries the potential risk of breaking all of the inter-connected Ubuntu
systems, each Ubuntu developer needs to have acquired the formal trust of the community through direct
approval from the technical board in order to be able to actively change the system.
Thus, stigmergic development is driven by an awareness that every change you upload affects the entire
system. This is not only an immense responsibility but also a very empowering position, as it allows you to
fix a problem that has been bothering you – not just for yourself, but for everybody else who might
encounter the same problem.
One example of this is Martin who maintains the software package containing the database PostgreSQL.
“       ... whenever I screwed up and did a bug
  in the new version and then people who
  upgraded this said "hey, you just broke my
  installation" [..] This really got me the feeling
  that everything I do is actually used outside
  there.... and it is important and people are
  relying on it...

                                                       - Martin Pitt
This indirect social connection through the use and development of the system is central to Ubuntu hacker
sociality. Since the system is shared and developed among the several hundred hackers with access to
change it, the people who depend on it come to trust the developers responsible for the packages they use
to make only beneficial and well-considered changes.
In this way, the Ubuntu hackers maintain a communal regime of mutual accountability: aware of the
responsibility they hold, and ready to let others know when a change breaks their carefully configured
systems.
“       Coordination in Free software
    privileges adaptability over planning.




                                      - Christopher Kelty
                                           Two Bits - the cultural significance
                                           of Free Software (2008)
This mutual accountability, built through the hackers' shared use and development of the system, is at the
core of the collaborative practices of free software. As Kelty concludes:
"Coordination in Free Software privileges adaptability over planning. This involves more than simply allowing
any kind of modification; the structure of Free Software coordination actually gives precedence to a
generalized openness to change, rather than to the following of shared plans, goals, or ideals dictated or
controlled by a hierarchy of individuals."
I see this pattern very clearly in the day-to-day work of the Ubuntu developers: sure, there are goals and
planning and governance in their work, but the driving goal of the Ubuntu hackers continues to be to build
"a system that works for me" - a system that matches their personal practices with the computer. A system
that is continually and cumulatively improved through the shared effort of the Ubuntu hackers, each
adapting the default system to his or her own needs, extending and developing it as needed along the way.
“       Debugging, in this perspective is not
     separate from design. Both are part of a
     spectrum of changes and improvements
     whose goals and direction are governed by
     the users and the developers themselves...


                                       - Christopher Kelty
                                           Two Bits - the cultural significance
                                           of Free Software (2008)
The key element in free software development is the ability to see the development of software as a
spectrum rather than a series of releases. This implies more than just continuous work on a product; it
means seeing the product itself as something fluid, built out of previous ideas and products and
transforming, differentiating into new ones. Debugging, in this perspective, is not separate from design.
Both are part of a spectrum of changes and improvements whose goals and direction are governed by the
users and the developers themselves.
In this way, Free Software is a continuing praxis of "figuring out" - giving up a sense of finality in the
product in order to continually adapt and redesign the system. Kelty argues that successful free software
projects are "experiments with adaptability - attempts at figuring out - that have worked, to the surprise of
many who have insisted that complexity requires planning and hierarchy."
A recursive public




This constant self-referential experimentation, “figuring out”, and adaptation of the Ubuntu system results in
what Christopher Kelty calls “a recursive public”:
A recursive public is a public "whose existence (which consists solely in address through discourse) is
possible only through discursive and technical reference to the means of creating this public."
That is to say: The Ubuntu community exists solely to talk about and coordinate the continued development
of the Ubuntu system, which is also the means to develop Ubuntu, which is also the means through which
they are able to discuss and develop Ubuntu. It is a self-referential, recursive loop of discourse.
Ubuntu is recursive through its discourse about technology (the Ubuntu system), which is also the
technological means that make that discourse possible.
Furthermore, this technology consists of many recursively dependent layers of technical infrastructure: The
applications, the operating system, the internet protocols... Indeed, the depth of recursion is determined by
the openness necessary for the project itself.
In this way, Ubuntu and other free software projects are very similar to Douglas Hofstadter's notion of
strange loops: with each new development cycle, the system and the community go further than ever
before, figuring out and adapting the system to their needs, only to return to the starting point: the
unstable continuum of software-in-development from whence they started out. A state of permanent
development from which a new version branches out, wiser and better than before. But the system and the
community themselves are never finished.
It is a technological ecosystem of sorts: The individual developers depend on the system to augment their
intellect in their daily work. The developers depend on each other to ensure that the system as a whole
continues to function. And the system itself depends on the developers’ continued developing and
maintaining it.
“      Age plus adaptivity is what makes a
    building come to be loved.

    The building learns from its occupants and
    they learn from it.


                                                  - Stewart Brand
                                                       How Buildings Learn (1994)

In his book, “How buildings Learn”, Stewart Brand concludes that “Age plus adaptivity is what makes a
building come to be loved.” - it is the strange loop of shared learning between the various occupants
shaping the building, and the building shaping them in turn that results in good buildings, loved and lived in
over the centuries.
Age plus adaptivity is what makes a system
   come to be loved.

   The system learns from its users and they
   learn from it.




Based on my fieldwork, I would argue that the same could be said of computer systems - it is the strange
loop of shared learning between the various users shaping the system, and the system shaping them in
turn, that is at the core of a good computer system. We have lived in houses for hundreds of years,
adapting them to our changing needs. We have only had computer systems for 40 years or so, but already
we are finding ways and processes of building and maintaining systems together such that they can adapt
endlessly and stand the test of time.
?
This leaves room for a lot of interesting questions and themes for further exploration:
- How will we as computer users change as a result of becoming steadily more dependent on computer
systems like Ubuntu to augment our intellect?
- What role will average users (not developers), incapable of adapting the details of the system to their
personal needs, play in the development of the Ubuntu system?
- How does one become an Ubuntu developer (this question links with my work on Ubuntu as a community
of practice, which I’ll have to do another presentation on at some point)?
- What is the broader cultural significance of the practices of free software development in relation to how
we collaborate and organize ourselves in other contexts?
- If code is law, as Lawrence Lessig and others have stated, how can the development and governance of
operating systems be analogous to, for instance, the development of the democratic constitutions?
- If technology becomes a bigger part of other public spheres - such as public administration and decision
making, will they have to become recursive publics in their own rights?
- How would a lack of finality - the notion of being in a fluid continuum - a state of permanent development
- affect other public spheres?
Thank you!
   email: andreas.lloyd@gmail.com
   weblog: http://www.andreaslloyd.dk
   thesis: http://www.andreaslloyd.dk/thesis
   twitter: @andreaslloyd
Photo Credits:
   http://www.flickr.com/photos/gareandkitty/237547991
   http://www.flickr.com/photos/stuckincustoms/201034654
   http://www.flickr.com/photos/stuckincustoms/302644832
   http://www.flickr.com/photos/noirin/177599802
   http://www.flickr.com/photos/omar_eduardo/127707517
   http://www.flickr.com/photos/wallyg/158337907
   http://www.flickr.com/photos/ableman/2171326385
   http://www.flickr.com/photos/wetwebwork/2847766967
   http://www.flickr.com/photos/chanc/1389379381
   http://www.flickr.com/photos/sfllaw/380959080
   http://www.flickr.com/photos/kables/9874584
   http://www.flickr.com/photos/stuckincustoms/217440037
   http://www.flickr.com/photos/z287marc/3189567558
   http://www.flickr.com/photos/75491103@N00/3565601486
   http://www.flickr.com/photos/runnerone/1487082237
I’d love to hear your questions, answers and comments, so if you have any, please get in touch. Thanks for
reading!

More Related Content

What's hot

Internet to web: The 40-year old Internet and the 20-year-old Web
Internet to web:  The 40-year old Internet and the 20-year-old WebInternet to web:  The 40-year old Internet and the 20-year-old Web
Internet to web: The 40-year old Internet and the 20-year-old WebJohan Koren
 
The Internet and the World Wide Web [Fall 2012 RTF 319 Session 04]
The Internet and the World Wide Web [Fall 2012 RTF 319 Session 04]The Internet and the World Wide Web [Fall 2012 RTF 319 Session 04]
The Internet and the World Wide Web [Fall 2012 RTF 319 Session 04]William J. Moner
 
Free & Open Source Software (2017 update)
Free & Open Source Software (2017 update)Free & Open Source Software (2017 update)
Free & Open Source Software (2017 update)Frederik Questier
 
Open source ecosystem evolution open stack and kubernetes models
Open source ecosystem evolution open stack and kubernetes modelsOpen source ecosystem evolution open stack and kubernetes models
Open source ecosystem evolution open stack and kubernetes modelsAntonio Ojea Garcia
 
Libraries Developing Openly
Libraries Developing OpenlyLibraries Developing Openly
Libraries Developing OpenlyNicole C. Engard
 
Free Software Introduction
Free Software IntroductionFree Software Introduction
Free Software Introductionshirish agarwal
 
Foss presentation
Foss presentationFoss presentation
Foss presentationAman Routh
 
Introduction to GNU/Linux, Free Software, Open Source Software, FSF, FSM, OSI
Introduction to GNU/Linux, Free Software, Open Source Software, FSF, FSM, OSIIntroduction to GNU/Linux, Free Software, Open Source Software, FSF, FSM, OSI
Introduction to GNU/Linux, Free Software, Open Source Software, FSF, FSM, OSIVarun Mahajan
 
Introduction to foss
Introduction to fossIntroduction to foss
Introduction to fossAltin Ukshini
 

What's hot (11)

Internet to web: The 40-year old Internet and the 20-year-old Web
Internet to web:  The 40-year old Internet and the 20-year-old WebInternet to web:  The 40-year old Internet and the 20-year-old Web
Internet to web: The 40-year old Internet and the 20-year-old Web
 
The Internet and the World Wide Web [Fall 2012 RTF 319 Session 04]
The Internet and the World Wide Web [Fall 2012 RTF 319 Session 04]The Internet and the World Wide Web [Fall 2012 RTF 319 Session 04]
The Internet and the World Wide Web [Fall 2012 RTF 319 Session 04]
 
Free & Open Source Software (2017 update)
Free & Open Source Software (2017 update)Free & Open Source Software (2017 update)
Free & Open Source Software (2017 update)
 
Open source ecosystem evolution open stack and kubernetes models
Open source ecosystem evolution open stack and kubernetes modelsOpen source ecosystem evolution open stack and kubernetes models
Open source ecosystem evolution open stack and kubernetes models
 
Libraries Developing Openly
Libraries Developing OpenlyLibraries Developing Openly
Libraries Developing Openly
 
Free Software Introduction
Free Software IntroductionFree Software Introduction
Free Software Introduction
 
Foss presentation
Foss presentationFoss presentation
Foss presentation
 
Open source jura CBS (03 11-2010)
Open source jura CBS (03 11-2010)Open source jura CBS (03 11-2010)
Open source jura CBS (03 11-2010)
 
Introduction to GNU/Linux, Free Software, Open Source Software, FSF, FSM, OSI
Introduction to GNU/Linux, Free Software, Open Source Software, FSF, FSM, OSIIntroduction to GNU/Linux, Free Software, Open Source Software, FSF, FSM, OSI
Introduction to GNU/Linux, Free Software, Open Source Software, FSF, FSM, OSI
 
FOSS
FOSS FOSS
FOSS
 
Introduction to foss
Introduction to fossIntroduction to foss
Introduction to foss
 

Similar to A system that works for me

Linus Case Synthesis Essay
Linus Case Synthesis EssayLinus Case Synthesis Essay
Linus Case Synthesis EssayKim Moore
 
ppt on linux by MUKESH PATEL
ppt on linux by MUKESH PATELppt on linux by MUKESH PATEL
ppt on linux by MUKESH PATELneo_patel
 
Nt1330 Unit 4.3 Assignment 1
Nt1330 Unit 4.3 Assignment 1Nt1330 Unit 4.3 Assignment 1
Nt1330 Unit 4.3 Assignment 1Amanda Reed
 
Perspectives on Open
Perspectives on OpenPerspectives on Open
Perspectives on OpenTim O'Reilly
 
Introduction to linux
Introduction to linuxIntroduction to linux
Introduction to linuxMedhat Dawoud
 
Open Source Software and Libraries
Open Source Software and LibrariesOpen Source Software and Libraries
Open Source Software and LibrariesEllyssa Kroski
 
Linux principles and philosophy
Linux principles and philosophyLinux principles and philosophy
Linux principles and philosophyFa6ma_
 
Welcome to linux base distribution!
Welcome to linux base distribution!Welcome to linux base distribution!
Welcome to linux base distribution!Md. Taiseen Azam
 
Group motivation in open source
Group motivation in open sourceGroup motivation in open source
Group motivation in open sourceTom Hickerson
 
Lecture1 100412095202-phpapp02
Lecture1 100412095202-phpapp02Lecture1 100412095202-phpapp02
Lecture1 100412095202-phpapp02Asma Meo
 
Os revolution reaction paper
Os revolution reaction paperOs revolution reaction paper
Os revolution reaction paperMarklin
 
Introduction to FOSS, SRM University
Introduction to FOSS, SRM UniversityIntroduction to FOSS, SRM University
Introduction to FOSS, SRM UniversityAtul Jha
 
Kernel linux lab manual feb (1)
Kernel linux lab manual feb (1)Kernel linux lab manual feb (1)
Kernel linux lab manual feb (1)johny shaik
 

Similar to A system that works for me (20)

Linus Case Synthesis Essay
Linus Case Synthesis EssayLinus Case Synthesis Essay
Linus Case Synthesis Essay
 
ppt on linux by MUKESH PATEL
ppt on linux by MUKESH PATELppt on linux by MUKESH PATEL
ppt on linux by MUKESH PATEL
 
Nt1330 Unit 4.3 Assignment 1
Nt1330 Unit 4.3 Assignment 1Nt1330 Unit 4.3 Assignment 1
Nt1330 Unit 4.3 Assignment 1
 
Perspectives on Open
Perspectives on OpenPerspectives on Open
Perspectives on Open
 
Introduction to linux
Introduction to linuxIntroduction to linux
Introduction to linux
 
Linux Introduction
Linux IntroductionLinux Introduction
Linux Introduction
 
Open Source Software and Libraries
Open Source Software and LibrariesOpen Source Software and Libraries
Open Source Software and Libraries
 
Linux principles and philosophy
Linux principles and philosophyLinux principles and philosophy
Linux principles and philosophy
 
Coacpxp
CoacpxpCoacpxp
Coacpxp
 
Welcome to linux base distribution!
Welcome to linux base distribution!Welcome to linux base distribution!
Welcome to linux base distribution!
 
Torocpsummit 130116180230-phpapp02
Torocpsummit 130116180230-phpapp02Torocpsummit 130116180230-phpapp02
Torocpsummit 130116180230-phpapp02
 
Group motivation in open source
Group motivation in open sourceGroup motivation in open source
Group motivation in open source
 
Lecture1 100412095202-phpapp02
Lecture1 100412095202-phpapp02Lecture1 100412095202-phpapp02
Lecture1 100412095202-phpapp02
 
FLOSS & OER
FLOSS & OERFLOSS & OER
FLOSS & OER
 
Os revolution reaction paper
Os revolution reaction paperOs revolution reaction paper
Os revolution reaction paper
 
Linux
LinuxLinux
Linux
 
GNU turns 30
GNU turns 30GNU turns 30
GNU turns 30
 
Introduction to FOSS, SRM University
Introduction to FOSS, SRM UniversityIntroduction to FOSS, SRM University
Introduction to FOSS, SRM University
 
Kernel linux lab manual feb (1)
Kernel linux lab manual feb (1)Kernel linux lab manual feb (1)
Kernel linux lab manual feb (1)
 
linux.ppt
linux.pptlinux.ppt
linux.ppt
 

Recently uploaded

Enhancing User Experience - Exploring the Latest Features of Tallyman Axis Lo...
Enhancing User Experience - Exploring the Latest Features of Tallyman Axis Lo...Enhancing User Experience - Exploring the Latest Features of Tallyman Axis Lo...
Enhancing User Experience - Exploring the Latest Features of Tallyman Axis Lo...Scott Andery
 
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...panagenda
 
Moving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdfMoving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdfLoriGlavin3
 
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...Wes McKinney
 
2024 April Patch Tuesday
2024 April Patch Tuesday2024 April Patch Tuesday
2024 April Patch TuesdayIvanti
 
Digital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptxDigital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptxLoriGlavin3
 
Long journey of Ruby standard library at RubyConf AU 2024
Long journey of Ruby standard library at RubyConf AU 2024Long journey of Ruby standard library at RubyConf AU 2024
Long journey of Ruby standard library at RubyConf AU 2024Hiroshi SHIBATA
 
Scale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL RouterScale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL RouterMydbops
 
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyesHow to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyesThousandEyes
 
[Webinar] SpiraTest - Setting New Standards in Quality Assurance
[Webinar] SpiraTest - Setting New Standards in Quality Assurance[Webinar] SpiraTest - Setting New Standards in Quality Assurance
[Webinar] SpiraTest - Setting New Standards in Quality AssuranceInflectra
 
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024BookNet Canada
 
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxMerck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxLoriGlavin3
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsPixlogix Infotech
 
DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsSergiu Bodiu
 
So einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdfSo einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdfpanagenda
 
A Framework for Development in the AI Age
A Framework for Development in the AI AgeA Framework for Development in the AI Age
A Framework for Development in the AI AgeCprime
 
How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.Curtis Poe
 
Generative Artificial Intelligence: How generative AI works.pdf
Generative Artificial Intelligence: How generative AI works.pdfGenerative Artificial Intelligence: How generative AI works.pdf
Generative Artificial Intelligence: How generative AI works.pdfIngrid Airi González
 
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024BookNet Canada
 
What is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdfWhat is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdfMounikaPolabathina
 

Recently uploaded (20)

Enhancing User Experience - Exploring the Latest Features of Tallyman Axis Lo...
Enhancing User Experience - Exploring the Latest Features of Tallyman Axis Lo...Enhancing User Experience - Exploring the Latest Features of Tallyman Axis Lo...
Enhancing User Experience - Exploring the Latest Features of Tallyman Axis Lo...
 
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...
Why device, WIFI, and ISP insights are crucial to supporting remote Microsoft...
 
Moving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdfMoving Beyond Passwords: FIDO Paris Seminar.pdf
Moving Beyond Passwords: FIDO Paris Seminar.pdf
 
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
The Future Roadmap for the Composable Data Stack - Wes McKinney - Data Counci...
 
2024 April Patch Tuesday
2024 April Patch Tuesday2024 April Patch Tuesday
2024 April Patch Tuesday
 
Digital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptxDigital Identity is Under Attack: FIDO Paris Seminar.pptx
Digital Identity is Under Attack: FIDO Paris Seminar.pptx
 
Long journey of Ruby standard library at RubyConf AU 2024
Long journey of Ruby standard library at RubyConf AU 2024Long journey of Ruby standard library at RubyConf AU 2024
Long journey of Ruby standard library at RubyConf AU 2024
 
Scale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL RouterScale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL Router
 
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyesHow to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
 
[Webinar] SpiraTest - Setting New Standards in Quality Assurance
[Webinar] SpiraTest - Setting New Standards in Quality Assurance[Webinar] SpiraTest - Setting New Standards in Quality Assurance
[Webinar] SpiraTest - Setting New Standards in Quality Assurance
 
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptxMerck Moving Beyond Passwords: FIDO Paris Seminar.pptx
Merck Moving Beyond Passwords: FIDO Paris Seminar.pptx
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and Cons
 
DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platforms
 
So einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdfSo einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdf
 
A Framework for Development in the AI Age
A Framework for Development in the AI AgeA Framework for Development in the AI Age
A Framework for Development in the AI Age
 
How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.How AI, OpenAI, and ChatGPT impact business and software.
How AI, OpenAI, and ChatGPT impact business and software.
 
Generative Artificial Intelligence: How generative AI works.pdf
Generative Artificial Intelligence: How generative AI works.pdfGenerative Artificial Intelligence: How generative AI works.pdf
Generative Artificial Intelligence: How generative AI works.pdf
 
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 
What is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdfWhat is DBT - The Ultimate Data Build Tool.pdf
What is DBT - The Ultimate Data Build Tool.pdf
 

A system that works for me

  • 1. “A system that works for me.” - an antropological fieldwork in a computer operating system Andreas Lloyd 2009 My interest in the Ubuntu Linux system began in October 2004 as I was browsing on the Internet looking for a new laptop computer. I noticed that some shops offered to sell me a computer with no operating system. I decided that I could save money by installing Linux, and after some trial-and-error, I managed to install Ubuntu.
  • 2. I was happy to find it working, but unnerved at the newness of it all: the file system was full of opaque folder names, changing a setting required typing esoteric commands, and whenever I started up or shut down the computer, it spouted hundreds of lines of technical messages like those featured above. If computers have souls, then the operating system is the eye through which the soul of the machine is reflected. And what and how much it reflects, is wholly up to the designers of the operating system. As I got used to the system, I became more and more intrigued with the transparency of it: All the technical details were laid bare before me, revealing the immense complexity of the operating system and the effort and technical knowledge that had gone into its making – all of which I had simply taken for granted until now, and which now offered itself for interpretation – if I knew how.
  • 3. All of this showcased technical expertise hinting at the huge software engineering effort stood in sharp contrast to the glossy login screen that followed the boot sequence. I soon learned that the transparency within the Ubuntu stemmed from its being open source software. Open source refers to the source code – which is what the programmer writes. And its being open signifies that it is available for anybody to edit and improve - if they know how. But though I had all of the blueprints, all the technical details available to me in the form of these millions of lines of open source code, I did not have the esoteric engineering knowledge necessary to appreciate nor modify it. So instead, I turned my interest to the people who did, curious to learn more about how they collaborate to build such an intricate system.
  • 4. In April 2006 I initiated anthropological fieldwork following the development cycle of a new version of Ubuntu, contributing and learning among the Ubuntu developers. I spent 6 months participating in the development of Ubuntu, including a complete Ubuntu development cycle from June to November 2006. I sought to integrate the on-line and in-person aspects of my fieldwork as closely as possible, and following my first in-person meeting with the Ubuntu hackers at the Ubuntu Developer Summit in Paris in late June 2006, I sent out a fieldwork visit request to 22 of the Ubuntu developers I had met there. Between August and November, I gradually spent more and more time visiting the hackers in their own working environments, participating in their everyday life and interviewing them, as well as taking part in the interactions of the whole community on-line.
  • 5. That’s me! Ubuntu Developer Summit, Paris 2006 This is the group photo from the Ubuntu Developer Summit in Paris 2006. It is conferences like these, where all of the Ubuntu developers came from all over the world to spend a week together that gave me the sense that this was a proper anthropological setting. A digital village emerging in a Paris airport hotel for a week of hacking, discussing and socialising.
  • 6. Andreas Lloyd: “A system that works for me” - an anthropological analysis of computer hackers’ shared use and development of the Ubuntu Linux System (2007) A big part of the resulting thesis focused on examining the Ubuntu hackers' day-to-day practices, and the main argument of the thesis is that the Ubuntu hackers’ shared use and development of the Ubuntu system constitutes a community of practice around their collaborative work and commitment to the project. By positing the Ubuntu community as a community of practice, I explore how the Ubuntu hackers are using new technical and social means to manage and share knowledge and skills on-line, and how these means of learning and sharing are reflected in the system itself.
  • 7. 1. How has Ubuntu come about? 2. How is Ubuntu being developed? (anthropologically speaking, that is) Now, Ubuntu as a community of practice is very interesting. But most of all, that is not the parts that I’ll be focusing on in this talk. Rather, I’ll be talking about two things: One: The cultural history of the Ubuntu system and how it has come about. Two: How Ubuntu is being developed through the shared practices of the Ubuntu developers. - and mind you, this is an anthropological study. Not an attempt at software history or a computer scientific exploration of open source development.
  • 8. 1. How has Ubuntu come about? So how did the Ubuntu system come about?
  • 9. Ubuntu is Free Software Well, the first thing to realize about Ubuntu is that it is Free Software. What does that mean exactly? Well, it is much the same as open source software though the differences are subject to a prolonged discussion among the proponents of each term.
  • 10. Christopher Kelty: Two Bits - the cultural significance of Free Software (2008) I use the term Free Software because that is what the Ubuntu hackers call their work. And because it makes it easier to link to this great book by Christopher Kelty, which explores the cultural history of Free Software. I have a chapter on the cultural history of Ubuntu in my thesis, and I must admit that Kelty’s work is much more thorough, detailed and well-argumented than mine. So I’m happy to use his work here.
  • 11. Five core practices of Free Software In his book, Kelty identifies 5 core practices of free software. All of these practices are instrumental in the development and dissemination of the Ubuntu system, so I’ll walk through these in turn, using them as a way to illustrate the cultural history of Ubuntu.
  • 12. Sharing source code Sharing source code is the most basic practice of free software. In order for the other practices to make sense, this first one must be in place.
  • 13. Dennis Ritchie & Kenneth Thompson Unix, 1969 Sharing source code is something hackers and software developers always have done. One of the primary joys of writing good software is being able to share it with peers, win their recognition and even have them use the software themselves. When Dennis Ritchie and Ken Thompson wrote the Unix operating system in the late 1960s and early 1970s (the architecture of which Linux and Ubuntu still uses today), they naturally wanted to share the source code with anybody who might be interested.
  • 14. Due to some historic coincidence, Ritchie and Thompson developed Unix while working at Bell Labs, a research department of AT&T. At that time, AT&T had a monopoly status on the American telecom industry and was not allowed to enter competing fields. Thus, the company couldn’t really make money off Unix. And so, Ritchie and Thompson could spread tapes with copies of the operating system to peers at universities and other companies all around the world without much expenditure. By the early 1980s Unix was the preferred system on university and business computers alike.
  • 15. ...they were living bodies of narrative that many people knew by heart, and told over and over again – making their own personal embellishments whenever it struck their fancy. The bad embellishments were shouted down, the good ones picked up by others, polished and improved and, over time, incorporated into the story. - Neal Stephenson In the beginning was the command line (1999) This open sharing helped Unix grow and improve, as other developers began to develop new software and improve the existing software for Unix. As the science fiction writer Neal Stephenson describes it, Unix in the 1970s and early 1980s became something akin to the Gilgamesh epic: An “oral tradition” - a living body of software narrative that was continually and cumulatively improved with each new version.
  • 16. Conceptualizing open systems Another core practice of free software is the conceptualization of open systems. What this means is conceptualizing and building systems in such a way that everybody are free to use and implement these systems without approval of some formal institution.
  • 17. Vint Cerf, Jon Postel, Stephen Crocker TCP/IP, 1975 The classic example of such an open system is the TCP/IP protocol, which was developed by the Internet Engineering Task Force - including these 3 distinguished gentlemen - from the mid-1970s. By publishing all of the specifications and having open discussions on the nature of the systems, they helped ensure its openness. And despite the fact that another networking standard had already received approval as an ISO standar (not an easy thing to do), TCP/IP became the Internet networking standard simply because it was already in use, and it was easy to adopt and adapt as needed.
  • 18. We reject kings, presidents and voting. We believe in rough consensus and running code. - Dave Clark Internet Engineering Task Force Dave Clark of the IETF sums up their position quite well: They don’t believe in a standard just because some standards body approve it. They believe in a standard that is the de facto standard because everybody is already using it. Other examples of open standards - most of which haven’t been as spectacularly successful as TCP/IP include the Open Document Format (ODF), the Ogg Vorbis audio format (OGG) and the HTTP protocol of the world wide web.
  • 19. Writing Licenses If you share your code or conceive of open systems, how can you ensure that it remains free and open? You need tailor-fitted software licenses for this.
  • 20. Richard Stallman GNU, 1984 In 1984, AT&T had lost its monopoly status and was broken into a number of smaller companies, one of which could now turn Unix into a cash cow with exorbitant license fees. A hacker at MIT, Richard Stallman found it morally intolerable for an owner of software to say “'I won't let you understand how this works; I'm going to keep you helplessly dependent on me and if you share with your friends, I'll call you a pirate and put you in jail.'” So 1984, he quit his job at MIT in order to devote himself to what he called free software – software that anyone can freely could study, use, modify and redistribute in any manner they please as long as they also freely redistribute their own changes to the source code. He decided that the best way to advance this agenda would be to write a completely free version of Unix, which he, in a typical case of hacker wit, called GNU – a recursive acronym for “GNU's Not Unix.” And building on the same “oral tradition” described by Stephenson, interested hackers soon began to send him improvements and suggestions, which he could incorporate in new releases of the GNU programs.
  • 21. The licenses for most software and other practical works are designed to take away your freedom to share and change the works. By contrast, the GNU General Public License is intended to guarantee your freedom to share and change all versions of a program--to make sure it remains free software for all its users. - Richard Stallman Preamble, GNU General Public License As the project grew, Stallman created the GNU General Public License (GPL) to legally ensure that the GNU programs remained free software. The GPL ensured that users could not modify the rights granted by the GPL to any user, meaning that any GPL software even if improved and combined with non-GPL software could only be redistributed on GPL terms. The American anthropologist Gabriella Coleman points out that with the GPL Stallman cleverly hacked traditional copyright law by inverting it: It was not a question of enforcing an authorial monopoly, but to ensure that his software was freely available and impossible to monopolize, in effect coercing all users of his software to give away their own improvements.
• 22. Coordinating Collaboration Though Stallman distributed his GNU software on the Internet, he did not use it to open the development process up for greater collaboration, which is probably the core practice for which free software is best known today.
  • 23. Linus Torvalds Linux, 1991 In 1991, Linus Torvalds, a young Finnish Computer Science student frustrated at waiting in line at the university Unix terminals, wanted to run Unix on his new PC at home. All he needed to complete the GNU system he had downloaded off the internet was a kernel – the core program which allocates resources to the various applications and users present on the system. He began writing such a kernel over the summer holidays that year. Since Finland had invested heavily in telecommunications infrastructure, he was one of the first to have Internet access from home, and on October 5th, he announced his kernel project on the Internet:
  • 24. This is a program for hackers by a hacker. I've enjoyed doing it, and somebody might enjoy looking at it and even modifying it for their own needs. It is still small enough to understand, use and modify, and I'm looking forward to any comments you might have. - Linus Torvalds Announcing Linux on Usenet, 1991 What was remarkable about Linux was that Torvalds embraced the fact that the Internet allows people from all over the world to collaborate instantly whereas earlier, such collaboration had been limited to the slow and random “oral history” exchange of tapes and disks. He used the Internet to constantly release new, rough versions of the code for others to test simply by making it available for download on-line, incorporating patches – code improvements – and suggestions which other hackers sent to him, discussing designs and plans on mailing lists and giving them the opportunity to add their own changes as they proved themselves technically capable.
• 25. Fomenting Movements The final core practice of free software is fomenting movements. Again, this was to a great extent developed by Richard Stallman as he fought for software freedom in the early 1980s, but it really came to the fore with the rise of Linux distributions. Throughout the 1990s, it was still a formidable task to get Linux installed and functional on one's computer, as it involved configuring and maintaining everything yourself. To make it easier to install and run Linux, distributions of Linux began to appear, integrating much of the new software with GNU/Linux for easier installation and maintenance.
  • 26. Ian Murdock Debian, 1993 Initiated in 1993 by Ian Murdock, Debian is one of the great Linux distributions, seeking to keep a complete catalog of all the free software available online.
  • 27. We will make the best system we can, so that free works will be widely distributed and used. - Debian Social Contract But unlike many of the other Linux distributions, which were owned and developed by companies, Debian was a community-driven effort with hundreds of developers gathering under the promise of the Debian Social Contract to “make the best system we can, so that free works will be widely distributed and used.” Debian is not merely a system or a distribution of software. It is a movement dedicated to distribution and dissemination of free software.
  • 28. This is a genealogy of sorts of the various Linux distributions in existence by 2007.
  • 29. As one of the old Linux distributions, Debian has spawned a wide number of other Linux distributions, which base their efforts off Debian’s codebase.
  • 30. Ubuntu is one such heir, branching out of the main line of Debian to produce a system with a somewhat differing set of goals than the main Debian system.
• 31. Mark Shuttleworth Ubuntu, 2004 In 2004, former Debian developer - and recent dot-com IT millionaire - Mark Shuttleworth hired 12 prominent free software developers - among them many of the most active Debian developers - to work on a new Linux distribution.
• 32. Microsoft has a majority market share in the new desktop PC marketplace. This is a bug, which Ubuntu is designed to fix. - Ubuntu bug no 1 Filed by Mark Shuttleworth Thankful for all the free software tools that had made his success possible, Mark Shuttleworth wanted to give back to the free software communities. He had seen how proprietary software like Microsoft Windows was forcing countries like his native South Africa to spend millions on software licenses, and he was disappointed at the unwillingness of the commercial Linux distributions to challenge Microsoft for more than a minimal market share. So Shuttleworth made it his lofty goal to break Microsoft's majority market share of PC operating systems, as he later stated in the first bug report to be entered into the Ubuntu bug tracking system.
• 33. Non-free software is holding back innovation in the IT industry, restricting access to IT to a small part of the world's population, and limiting the ability of software developers to reach their full potential. - Ubuntu bug no 1 Filed by Mark Shuttleworth By casting Microsoft's monopoly on desktop PC operating systems as a bug to be fixed, Shuttleworth defines proprietary software as an active impediment to software innovation. By positioning Ubuntu as the solution to this bug, Shuttleworth put the core practices of Free Software at the centre of software innovation, reflecting this in the name Ubuntu – an ancient Zulu word meaning “humanity towards others” whose associated ideology of shared humanity provides a humanistic basis for the free software movement that he is championing.
  • 34. Sharing source code Conceptualizing open systems Writing licenses Coordinating collaboration Fomenting movements So, to sum up: The Ubuntu system comes from these five core practices, and it is continuing the development of these practices into the 21st century.
• 35. 2. How is Ubuntu being developed? So let's look at just how they're doing that. But in order to do so, perhaps it's necessary to get an idea of just how big the Ubuntu system is.
  • 36. The Ubuntu system is an abstract thing, consisting of thousands of lines of source code. But computer source code has a unique quality of both being abstract information as well as being able to have a concrete effect in the world. Software developers are often called software engineers, and for good reason. A good way to think of the Ubuntu system is as a huge engineering effort of accumulated complexity managed and continually developed by its users.
  • 37. Empire State Building Ca. 3500 man years Let’s look at some other engineering works of some size and complexity... like the Empire State Building
  • 38. The Panama Canal Ca. 10.000 man years Let’s look at some other engineering works of some size and complexity... like the Panama Canal
• 39. Debian in 2005 230.000.000 lines of code estimated 60.000 man years In 2005, a group of Spanish computer scientists counted the amount of code in the whole Debian system. It came to a total of around 230.000.000 lines of code. Using an industry standard for estimating programming labor, they estimated that the Debian system at that point was the result of 60.000 man years of work - all of it obviously having taken place over the previous 20 years or so. Some might object that this is comparing apples and oranges, and that it would be more interesting to compare the Debian system to an intellectual work such as an encyclopedia or a library. I'd agree, but for some reason, buildings and canals are just a lot more impressive: It's more than six times the effort of digging and constructing a channel between two of the biggest oceans in the world. That's BIG.
  • 40. We build our computers the way we build our cities -- over time, without a plan, on top of ruins. - Ellen Ullman Close to the machine (1997) In fact, when you think about it, the only other works of human engineering of comparable size are cities. And as the programmer-turned-writer Ellen Ullman puts it, we build our computers - and our computer systems - in much the same way as we build our cities.
• 41. Ubuntu consists of more than 20.000 software packages. And indeed, Ubuntu, inheriting the modular design of Debian, is constructed much like a city: Consisting of more than 20.000 software packages, some big, some small, and all of them part of an intricate web of complex dependencies. Much like a carefully assembled model: Some parts need to be in place for others to work, and some parts depend on functionality supplied by packages on which they aren't even directly dependent. Much like a city and its interconnected web of services and infrastructure.
• 42. Upstream Free software projects Debian unstable Ubuntu release Downstream User’s system Each software package in Ubuntu is a piece of software developed outside of Ubuntu by other developers in separate free software projects. Since the developers involved with these projects are rarely directly involved with Ubuntu, they are often called the upstream for distributions like Debian and Ubuntu. Since Ubuntu depends on the Debian development version known as Debian Unstable to gather the latest versions of all of the free software in the whole free software eco-system, Debian is also the upstream for Ubuntu, which in turn is the upstream for the actual users. In this way, there is a constant flow of the latest changes from the various free software projects into Debian, and from there into Ubuntu. Most of the work of the Ubuntu developers thus consists of packaging and integrating this new software with the existing Ubuntu system.
• 43. What’s in a software package? Well, an easy example would be an application like the web browser Firefox.
• 44. Or the free software Photoshop equivalent known as GIMP.
• 45. Or it could be something more fundamental, like the programming language Python, which other software packages written in Python depend on.
  • 46. ... organizations which design systems... are constrained to produce designs which are copies of the communication structures of these organizations. - Melvin Conway How Do Committees invent? (1968) The modular design of the Ubuntu system is a remnant from Debian, where each software package has its own maintainer, who has complete control over his or her package. Though Ubuntu does not have designated package maintainers in this way, the communication lines are much the same: A distributed system where hundreds of developers can work on separate packages without having to change the rest of the system (well, most of the time, at least). This confirms Conway’s Law that organisations designing software systems produce designs that copy the communication structures of that organisation.
• 47. Let’s look into how that works. Once Ubuntu is installed on your computer, the first thing the system does is to check the online software package repository for further changes to the system. It then proceeds to download the corresponding changes and update your system. Whenever the Ubuntu developers fix a bug or implement a new feature in a software package, they upload a new version of that package containing those changes to the main Ubuntu archive on-line, against which all on-line Ubuntu systems running the same version of Ubuntu are synchronized through Internet updates. This not only makes it very easy to distribute fixes as they are produced, but also to introduce unexpected problems if these updates aren't thoroughly tested.
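The synchronization described above can be sketched in miniature as a shell script. This is a toy illustration only - the directory layout and plain-text version files are hypothetical - but it shows the basic logic of comparing local package versions against an on-line archive and pulling in whatever is newer:

```shell
#!/bin/sh
# Toy simulation of how an Ubuntu system synchronizes with the archive.
# Illustrative sketch only: the real mechanism is apt comparing installed
# package versions against the Ubuntu archive over the network.
mkdir -p archive system

# The archive holds the latest version of each package...
echo 3 > archive/firefox
echo 7 > archive/python
# ...while the local system lags behind on one of them.
echo 3 > system/firefox
echo 6 > system/python

# The "upgrade" step in miniature: for every package in the archive,
# install it locally if the archive's version is newer.
for pkg in archive/*; do
  name=$(basename "$pkg")
  if [ ! -f "system/$name" ] || [ "$(cat "$pkg")" -gt "$(cat "system/$name")" ]; then
    cp "$pkg" "system/$name"
    echo "upgraded $name to version $(cat "$pkg")"
  fi
done
```

On a real Ubuntu system the equivalent steps are `sudo apt-get update` (refresh the local index of available versions) followed by `sudo apt-get upgrade` (fetch and install the newer packages).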
• 48. Stable version 1 Stable version 2 Development version In order to be able to extend the Ubuntu system without risking breaking it for their users through an ill-conceived software update, the Ubuntu developers work on at least three active versions of Ubuntu at any given time: Two stable versions, which the Ubuntu developers only update as new security holes and integral weaknesses are exposed, and one current development version, which only developers use.
• 49. The Ubuntu developers themselves almost always use the development version of the Ubuntu system for all of their work. The development version contains the latest unstable software derived from Debian and the various upstream free software developers, of whose software both Debian and Ubuntu are composed. The main development work of the Ubuntu developers is to gradually stabilize more and more of the system, testing and ensuring its quality in time for release.
  • 50. Indeed, one of the main differences between the Ubuntu system and Debian is the pace of software releases. Where it typically takes Debian 18-23 months to release a new stable version, Ubuntu releases a new version every 6 months. As mentioned before, both Debian and Ubuntu releases are based on the permanent development version of Debian called Debian Unstable, which is a snapshot of the latest versions of all the free software in the free software ecosystem.
• 51. The first thing to realize is that the outside world ran on releases of UNIX (V4, V5, V6, V7) but we did not. Our view was a continuum. - Ken Thompson In this way, Debian Unstable is remarkably similar to the original Unix system at AT&T. Their system was always in development. Releases were only something they did when they had to pass the software on to people outside AT&T.
  • 52. V5 was simply what we had at some point in time and was probably put out of date simply by the activity required to put it in shape to export. - Ken Thompson And as the development of Ubuntu and Debian shows, the effort it takes to release a stable version of the system is so big that it is out of date by the time it has been released, as new software continues to trickle down from the free software projects upstream.
• 53. Thus, to the Ubuntu developers, the Ubuntu system looks a bit like the Beijing skyline: A place of constant development. Whenever a building is finished, some other part of town is ready to be renovated or even replaced completely.
• 54. This perpetual state of development can be illustrated through an examination of the Ubuntu release cycle, which is ... well, cyclical and somewhat circular. Each new version begins life by merging the old version with all of the new software from Debian Unstable, along with the new features specifically designed and developed for that version of Ubuntu. All of this very unstable software is then slowly frozen (closed to new feature changes), improved, adapted and stabilised, tested, and further stabilised until the whole system is ready for release. And shortly thereafter, the whole cycle begins anew.
  • 55. ... users know better than operating system designers what their own needs are. - Eric Raymond So how is Ubuntu developed? Well, even though Ubuntu aims at making Linux and free software more accessible for non-technical users, much of the development effort still follows the notion from free software evangelist Eric Raymond that “... users know better than operating system designers what their own needs are.” He said it of Unix, but much the same is true of Ubuntu. The Ubuntu system is primarily developed to match the needs of the users who are capable of developing it: The Ubuntu developers themselves. Eric Raymond also said that every good piece of software starts with a developer scratching his own, personal itch. These itches are indeed contracted through the developers’ own use of the system.
  • 56. ... users know better than operating system designers what their own needs are. - Eric Raymond That said, the Ubuntu developers do attempt to work with interaction designers and usability experts to improve the system’s user interfaces in general. But such design and planning is not core to the way they develop the system, as we shall see. They also make a keen effort to open up development work and make it as easy and accessible as possible for newcomers to get involved scratching itches of their own. This effort is also very fascinating, but it relates directly to my analysis of Ubuntu as a community of practice, which is another, long story of its own.
• 57. So, how do the Ubuntu hackers use the Ubuntu system? Their work is a balanced duality between focusing intensely on their programming work and constantly scanning the periphery for new information to relate to. This duality is summed up quite nicely by the way Ubuntu hacker Martin has organized his workspace.
• 58. He has set up two IRC clients monitoring the channels he is most interested in, and on top of them, he has placed his main tool, the command line terminal, in such a way that the last 10 lines of conversation from the IRC clients remain visible, allowing him effortless participation in the constant flow of community chatter as part of his work on the Ubuntu system.
• 59. Martin uses the command line for most of his work. Not only does it give him a clear and transparent means of interacting with the computer through the concise and extensible commands he inputs, it also allows him to spread his work beyond his own machines: he can use it to log on to computers related to other free software projects around the world, or to his own server halfway across the country, with the abstract and transparent textuality of the command line offering him the same level of control of those remote systems as if he were there in person.
• 60. When programming and working intensely on the computer, a hacker often enters a state of uninterrupted concentration in which he loses touch with the outside world, focusing solely on the computer to the point where the interaction with the computer happens effortlessly, allowing him to immerse himself in the code to the point where "You don't need to think about the variable names, they will just appear in your mind when you need them," as one Ubuntu hacker puts it. This state is very similar to what the psychologist Mihaly Csikszentmihalyi calls flow: “the state in which people are so involved in an activity that nothing else seems to matter.” It is worth noting how such diverse activities as programming, playing computer games or writing essays all depend on the computer's continuous feedback and need for interaction, its minimal requirement of physical exertion, and its huge mindful potential to induce and maintain the flow state for hours on end. The flow state requires complete concentration and control, pooling all your attention in a tunnel-like vision focused on the screen, making you willfully unaware of your surroundings, including the physical actions you perform to interact with the computer, empowering and guiding you only towards the work on the screen.
• 61. ... the re-arrangement of tiles on the tray is not part of action, it is part of thought - Clark & Chalmers In this way, the interaction with the computer is similar to the American philosophers Clark & Chalmers' notion of “active externalism”, through which they argue that certain mental tasks depend on physical objects to be solved efficiently. They give the example of the letter tiles in a Scrabble game, where we depend on physically moving the tiles in order to win new perspectives on the possible words to be spelled, rather than on some internal process of our own. Thus, “the re-arrangement of tiles on the tray is not part of action; it is part of thought.” (Clark & Chalmers 1998:7; emphasis original). In this way, the computer and the system through which it works become an extension of the hacker's mind; it is only through constant, uninterrupted interaction and focus on the computer that he can solve the task at hand.
• 62. Although this mode of working seems highly individual, there is a constant flurry of interaction taking place through the computer. Each developer attends IRC channels and subscribes to mailing lists reflecting their different interests around the community. Most of these discussions are essentially long question and answer sessions, as hackers turn to fellow hackers as well as manuals, web searches and documentation to find the answers necessary to retain their mastery of the computer. The need to avoid interruption makes hackers somewhat asynchronous to one another – at least in the short term. This is reflected clearly by the fact that all of the Ubuntu hackers' preferred on-line communicative means are textual and thus – at least to some extent – asynchronous. Email, newsgroup and web forum postings and bug tracker comments are all based on their users reading and replying asynchronously. Even real-time communications such as IRC chat channels and Instant Messaging bend to this rule as developers “ping” each other, and if there's no immediate response, they can ask their question and let the other party answer when he has time or attention to spare.
• 63.
...
09:00 carlos   pitti: ping
[...]
09:07 pitti    carlos: pong
09:08 carlos   pitti: I did a mistake yesterday night and latest Edgy export has the plural form bug (bug #2322)
09:08 Ubugtu   Malone bug 2322 in rosetta "Truncated plural forms" [Critical,In progress] http://launchpad.net/bugs/2322
09:09 carlos   pitti: I'm exporting a new version with that fixed, but it would take around 2-3 hours
09:09 carlos   am I late to have it in the prerelease version?
09:09 pitti    carlos: ah, then I'll rebuild the edgy packs this afternoon
09:09 pitti    carlos: it won't go into RC anyway
09:09 carlos   ok
09:09 pitti    carlos: the plan is to upload the final packs tomorrow morning
09:10 pitti    carlos: thus I'd like to have today's in perfect shape
09:10 carlos   I see
09:10 carlos   ok
09:10 carlos   pitti: I will ping you when the new version is available
Here, Carlos needs to notify Pitti of a new bug which Pitti needs to take into consideration when building a group of packages for upload. Since Pitti is busy, the conversation doesn't continue until he is able to respond and they can coordinate their work. Most of the Ubuntu hackers' day-to-day interaction takes place on IRC, where they can pick up on interesting discussions and be available if someone needs to ask a question. The hackers deftly navigate back and forth between conversations, fluidly participating as the IRC client automatically notifies them when someone “pings” them or even just mentions their on-line moniker. And even if they miss something, they can always go back to check the chat logs or mail archives, as all interactions within the community are recorded and publicly archived on-line.
• 64. ... people are not always in the public, but they are always ready to be. - Christopher Kelty Geeks, Recursive Publics and Social Imaginaries (2005) As media theorists Geert Lovink and Ned Rossiter have noted, on-line life is defined by passive consumption of information, as it would be impossible to relate and reply to all the information that passes by. This can literally be seen in the mailing lists and the IRC channels which the individual Ubuntu hackers subscribe to. Most of these they are interested in, but not so deeply as to participate actively and regularly – unless a topic comes up which they care deeply about. Instead, they spend most of their time and energy on relatively few mailing lists and IRC channels, where they participate much more actively as they are well acquainted with the people there, and where they contribute most of the discussion, joking and socialising to which others can relate passively. As Chris Kelty remarks, “people are not always in the public, but they are always ready to be.”
  • 65. “I got a system that works for me.” This fine-tuning of the system to the needs of the individual hacker was reflected in the Ubuntu hackers’ answers to the question “What do you get out of developing Ubuntu?” Almost all of them answered: “I get a system that works for me.” A system that not only supports their individual work, but also their shared exchanges online. Thus, the hacking practices of Ubuntu hackers are not the work of lone engineers, but rather a complex interplay between hacking work and social interaction on-line (as well as in-person).
  • 66. Howard Rheingold: Tools for thought (1985) This use of the computer is not all that unexpected. As Howard Rheingold’s 1985 history of the computer, remarkably titled “Tools for thought” explains, a number of the people involved in developing the computer as we know it today wanted to develop a tool for thought - a machine that could support and help man deal with the increasing complexity and speed of the modern world.
  • 67. By ‘augmenting human intellect’ we mean increasing the capability of a man to approach a complex problem situation, to gain comprehension to suit his particular needs, and to derive solutions to problems. - Douglas Engelbart Augmenting Human Intellect (1962) One of the leading visionaries in developing the computer as a tool for human thought was Douglas Engelbart, whose project to “augment human intellect” focused on the computer. He was one of the first to realize that the computer not only allowed for the individual computer user to handle complexity on his own, but also to collaborate with other users, thus augmenting their collective intellect further. Engelbart was a pioneer of online collaboration, creating the first proper interactive collaborative setup in 1968. In many ways, the Ubuntu hackers are among his intellectual heirs, as they have internalized his work to a great extent in their collaborative practices online.
  • 68. “I got a system that works for me.” But back to the individual hackers’ use of system: How does their own individual use and adaptation of the system to their personal needs affect the system as a whole?
  • 69. ...It's like moving from renting a house to buying a house – you begin to look at it in a different way. You begin to notice all the places where there is room for improvement. - Colin Watson Well, Ubuntu hacker Colin Watson gave this very interesting follow-up explanation as to what he got out of working on the Ubuntu system.
• 70. Part of it is commitment... having and taking the responsibility to make things work, but it is also having the opportunity to do so. You decide for yourself. - Colin Watson Here, Colin sums up what most of the Ubuntu hackers told me in some form or another: Working on Ubuntu is about deciding to make a commitment and taking responsibility for developing the system, much like any house owner decides to spend time on maintaining his house. The central point here is that it is not something that is forced upon them. Spending time maintaining and building the system rather than simply using it is a hobby, an interest that not only offers new insights, but also new social relationships with other builders.
  • 71. Stewart Brand: How Buildings Learn (1994) Stewart Brand’s book “How Buildings Learn” examines how buildings evolve after being constructed and after people start living in them. A key concept in that book is that of “shearing layers”. Brand explains how different parts - layers - of a building evolve at different paces. These are - from slowest to most rapid change: 1) The site where the house is built - nothing short of major climate change or earthquake will change it. 2) The structure of the house - once set in place, the basic frame of the house is difficult and expensive to change. 3) The skin of the house - the facade and roof of the house, which is its face to the outside world. 4) The internal services of the house - the water pipes, electricity, gas, broadband and TV cabling and such. 5) The space plan of the house - how the walls and doors and windows are set up. 6) The actual stuff of the inhabitants of the house, the furniture, the books, the cupboards and closets full of clothes and kitchenware. Each of these shearing layers has its own flow and changes at its own pace. A house is in constant, albeit very slow development, adapted to the needs of its inhabitants. I would argue that a computer system like Ubuntu works in much the same way.
  • 72. Jonathan Zittrain: The future of the Internet (2008) I’d argue that a computer system like Ubuntu consists of shearing layers similar to that of a house. This illustration is from Jonathan Zittrain’s book “The future of the Internet (and how to stop it)”, and shows the layers of technology which we take for granted in using a computer (right) and the internet (left). At the bottom is the hardware - the copper and fiber of the wires, the chips of the computers - in the middle are the open systems - the internet protocol and the operating systems - and at the top are the actual applications that we use - our everyday stuff.
• 73. Generativity is a system's capacity to produce unanticipated change through unfiltered contributions from broad and varied audiences. - Jonathan Zittrain The future of the Internet (2008) The hardware platform and the open systems built on top of it aren't controlled by a single entity like a government or a company, but rather distributed among everybody on the Internet with a PC of their own. Zittrain's core point is that this lack of centralized control makes these systems generative: they allow anybody - large and varied audiences - to adapt them to their own needs, creating unanticipated change through unfiltered contributions. Since the platforms - both the PC and the Internet - are open and generative, there is no single entity to prevent people from building whatever they please on top of them. And that is exactly what characterizes free software like Ubuntu.
  • 74. You're doing your task, or a task you've done before, and at some point you wonder ‘why do I always need to type five different commands to achieve this task which I'm doing over and over again?’ - Martin Pitt It is the generativity of the Ubuntu system which gives the Ubuntu developers the opportunity to adapt it to their own needs to the extent that they seek and enjoy. Martin gives a good example of how he uses this generativity in his daily work: Why does he have to type the same thing five times every time?
• 75. And so, some times when you have nothing urgent to do, you just think about the details and say ‘let's factorize this into a function’ or ‘let's factorize this into a shell script.’ - Martin Pitt And so he adapts the system in a way that closed and sterile systems - like the Facebook platform or the iPhone - can't allow: He adapts the system to his needs by making a script that turns those five commands into just one.
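Martin's kind of factorizing can be sketched as a small shell function. This is a hypothetical example, not Martin's actual script; such a function would typically live in a file like ~/.bashrc so that it is always at hand:

```shell
#!/bin/sh
# Hypothetical example of factorizing a repetitive task into one command:
# five steps for archiving the day's working files, collapsed into a
# single "snapshot" function.
snapshot() {
  dir="snapshot-$1"               # 1. name the snapshot directory
  mkdir -p "$dir"                 # 2. create it
  cp ./*.txt "$dir"               # 3. copy the working files into it
  date > "$dir/TIMESTAMP"         # 4. record when it was taken
  echo "snapshot stored in $dir"  # 5. confirm
}

# What used to be five commands typed over and over is now one:
echo "meeting notes" > notes.txt
snapshot monday
```

The point is not the particular task but the pattern: any sequence of commands a hacker finds himself repeating can be folded into a single named command of his own design.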
• 76. It’s like the “desire paths” you see in parks or on university campuses, which are the short cut routes people actually walk rather than the ones designed by the planners of the park. Taking the opportunity to make these short cuts yourself based on your own needs and interests rather than being dependent on the ones made by design. For Ubuntu developers, the more closely you can match your system to the desire paths of your mind, the better the system will augment your intellect and help you in your work.
• 77. The command line is basically like an organism that over time you can dynamically adapt it to your needs and to your current tasks... - Martin Pitt Martin has been building his own desire paths in his system through more than 8 years of constant use, adding scripts and short-cuts and fitting this tool for thought to his thoughts. This especially takes place at the command line level, the most flexible and transparent representation of the system, which allows Martin and other Ubuntu hackers to build a shared repertoire of tools and routines spanning almost all of their mutual engagement in the project, including programming, managing their emails and communications, emulating and testing other systems, and accessing other computers on the Internet, thereby constituting an entire digital landscape which they can adapt for their needs.
• 78. ... This is what makes it so much fun, because you can never do this with GUI [Graphical User Interface] applications. I mean, it might be easy to learn them, but it always take the same time to do, so it is hard to factorize common tasks to them. - Martin Pitt This also explains Martin and other Ubuntu developers’ hesitancy to use graphical user interfaces, for though they make the system easier to learn and use, they are much harder to adapt and personalise to your individual needs.
  • 79. To build is in itself already to dwell ... Only if we are capable of dwelling, only then can we build. - Martin Heidegger The German philosopher Heidegger argued that we are always dwelling whenever we use and inhabit our environment, and that it is only by dwelling that our environment can be understood and thus organised in the mind. Building on this, the British anthropologist Tim Ingold argues that building cannot be understood as a simple process of realising a pre-existing architectural design in raw materials – it is only through dwelling, relating to the world one inhabits, that building is possible. Thus, Ingold argues, as children grow up in environments furnished by the work of previous generations, they dwell in a built environment shaped by human intentionality, and through that environment they adopt “specific skills, sensibilities and dispositions” with which they in turn continue to build the environment (Ibid:185-186). I contend that a similar process of dwelling is at play in the Ubuntu system. Computer systems are like cities: layers upon layers of human intentionality that evolve as new inhabitants move in, constantly re-exploring and redefining their structures and applications, adapting and re-designing them for new purposes, maintaining them differently as needs change. And the Ubuntu system, and especially the command line at its heart, stems from the almost 40-year-old “oral tradition” of Unix technology being refined and redesigned for varying needs. Thus the system, like any other built environment, has been shaped by the activities of several generations of predecessors whose work is now reflected in the skills, dispositions, and dwelling of the Ubuntu developers. When they adopt and learn the details of the system, customising it to fit their needs, they begin to dwell within the system, internalising their work flow and their social relations to their shared community directly in the system itself, turning it into an extension of their mind.
  • 80. Umwelt Jakob von Uexküll: A Stroll Through the Worlds of Animals and Men (1957) In his book “A Stroll Through the Worlds of Animals and Men”, the Estonian biologist Jakob von Uexküll shows the manifold inhabitants of an oak tree: there is the fox building his lair at its roots, the owl perched high in its branches, the squirrel foraging for nuts in the intricate maze of the treetop, the various insects making their homes in the bark, and many others. Each of these animals perceives and acts differently in the environment of the oak tree, and von Uexküll defines the sum of each animal's actions and perceptions of the tree as its Umwelt. Von Uexküll argues that each animal is unable to detach its consciousness from its Umwelt and thus see the tree as a neutral object – they will invariably attach their own activities to it. The British anthropologist Tim Ingold argues that a house is inhabited in much the same way as von Uexküll's oak tree – that houses, too, are living organisms, constantly being built and rebuilt through the dwelling of their many inhabitants – human as well as non-human – each shaping their own Umwelten through the flow of intentional activity.
  • 81. I suggest that like the house or the oak tree, the Ubuntu system is an incredibly big and complex system with the potential to cover a constantly increasing number of configurations, uses and needs. Each user of Ubuntu values different functions of the system and has different motivations for using it, and through this use, each user comes to adapt the overall, standard system to his own personal work flow and interests, shaping it into a personal Umwelt, distinct and separate in use and configuration from that of other users, drawing upon the thousands of software packages to build out his own nest in which to dwell.
  • 82. It doesn't matter how acquainted you are with a particular brand of computer. When you need to borrow someone else's, it feels like you're using their toothbrush. The instant you turn a friend's computer on, it's there: that strange arrangement of familiar parts (why do they do it like that?), the whole disorienting logic of a place you thought you knew. You kind of recognize it. There's an order here. Then, a moment of terror. You are...peering into someone else's mind! - Kevin Kelly Out of Control (1994) This intimate relationship between user and computer gives rise to the anxiety of using somebody else's computer that Kevin Kelly describes. The more intertwined and personalized the system is to a specific user, the more alien and strange it will seem to everybody else. Using somebody else's system is indeed like peering into someone else's mind.
  • 83. These Umwelten can be as physically and mentally separate from one another as that of a squirrel is from that of an owl: from high-powered servers offering critical services to hundreds of users, to cheap desktop PCs used for games and socialising. Each user adapts the system to his needs, and some even contribute back based on that use, in order to scratch their own itches.
  • 84. Or like Og who submits Brazilian translations of text in the Ubuntu packages in order to improve Brazilian language support in Ubuntu.
  • 85. Or like Sebastian who, wanting a better graphical user interface for his favourite package manager, works to improve the user interface of Ubuntu’s package management software.
  • 86. Or like Henrik who, paralyzed from the neck down and thus requiring a specialized “head mouse” and dialing wand to navigate the system – a condition that shapes not only his work flow but also his configuration of the system – works to improve the accessibility features of Ubuntu to make the system easier to use for disabled users.
  • 87. Or like Scott, who, unhappy with the boot and reboot times of Ubuntu, works to replace the old Unix init system that boots up the machine with his own free software project, Upstart, which integrates better with the rest of the system.
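For illustration, an Upstart job is a short declarative file rather than an ordered shell init script. The sketch below is a hypothetical `/etc/init/mydaemon.conf` – the service name and binary path are invented, but the keywords (`start on`, `stop on`, `respawn`, `exec`) are Upstart's own:

```
# /etc/init/mydaemon.conf – a hypothetical Upstart job definition.
# Jobs declare the events they start and stop on, rather than relying
# on the numbered script ordering of the old init system.
description "example background service"

start on runlevel [2345]    # start when entering a multi-user runlevel
stop on runlevel [!2345]    # stop when leaving them

respawn                     # restart the daemon if it exits unexpectedly
exec /usr/sbin/mydaemon     # invented daemon binary
```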
  • 88. Like Jonathan, who, preferring the KDE desktop environment rather than the GNOME desktop environment that Ubuntu ships as default, works on Kubuntu - a sister distribution to Ubuntu, sharing almost all of the basic functionality, but with a different graphical interface and different desktop applications on top of it. It is all a matter of preference and interest.
  • 89. Thus, in order to grow, the system depends on all of the hackers and users individually dwelling in, sharing, and collaboratively building the system into a constantly evolving organism – like all the different species inhabiting and supporting the ecosystem of an oak tree – with each new version of the system building on top of the previous to reach further, supporting more uses and covering more needs.
  • 90. Whenever a Ubuntu hacker uploads a change to the system, he notes the change in the system change log – a central listing in which the developers register all of the changes to the system – and signs it with his digital signature to associate that change with his on-line identity. Thus, at this system-wide level, the Ubuntu hackers communicate their work by changing the system directly, thereby letting their actions and changes to the system tell of themselves.
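For illustration, an entry in that changelog follows the standard Debian/Ubuntu changelog format. The sketch below uses an invented package, bug number, and developer; in practice the upload carrying such an entry is signed with the developer's OpenPGP key:

```
hello (2.10-1ubuntu2) jaunty; urgency=low

  * debian/control: add a missing dependency on libfoo
    (hypothetical fix for LP: #123456)

 -- Jane Hacker <jane@example.com>  Mon, 06 Apr 2009 10:00:00 +0100
```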
  • 91. Stigmergic collaboration The Australian new media theorist Mark Elliott calls this kind of collaboration and communication directly through action "stigmergic collaboration", borrowing a term coined by the French zoologist Pierre-Paul Grassé in the 1950s to describe the way termites interact by building and maintaining their nest. Elliott argues that human stigmergic collaboration has become possible through digital means, which make it possible to sidestep social negotiation so that hundreds of developers can contribute and collaborate without having to get acquainted with and maintain relationships with one another. But since changing the system carries the potential risk of breaking all of the inter-connected Ubuntu systems, each Ubuntu developer needs to have acquired the formal trust of the community through direct approval from the technical board in order to be able to actively change the system.
  • 92. Thus, stigmergic development is driven by an awareness that every change you upload affects the entire system. This is not only an immense responsibility but also a very empowering position, as it allows you to fix a problem that has been bothering you – not only for yourself, but for everybody else who might encounter the same problem. One example of this is Martin, who maintains the software package containing the database PostgreSQL.
  • 93. ... whenever I screwed up and did a bug in the new version and then people who upgraded this said "hey, you just broke my installation" [..] This really got me the feeling that everything I do is actually used outside there.... and it is important and people are relying on it... - Martin Pitt This indirect social connection through the use and development of the system is central to Ubuntu hacker sociality. Since the system is shared and developed among several hundred hackers with access to change it, the people who use it come to depend on and trust the developers responsible for the packages they use to make only beneficial and well-considered changes. In this way, the Ubuntu hackers maintain a communal regime of mutual accountability, aware of the responsibility they hold, and ready to let others know when a change breaks their carefully configured systems.
  • 94. Coordination in Free software privileges adaptability over planning. - Christopher Kelty Two Bits - the cultural significance of Free Software (2008) This mutual accountability built through the hackers’ shared use and development of the system is at the core of the collaborative practices of free software. As Kelty concludes: "Coordination in Free Software privileges adaptability over planning. This involves more than simply allowing any kind of modification; the structure of Free Software coordination actually gives precedence to a generalized openness to change, rather than to the following of shared plans, goals, or ideals dictated or controlled by a hierarchy of individuals." I see this pattern very clearly in the day-to-day work of the Ubuntu developers: sure, there are goals and planning and governance in their work, but the driving goal of the Ubuntu hackers continues to be to build "a system that works for me" - a system that matches their personal practices with the computer. A system that is continually and cumulatively improved through the shared effort of the Ubuntu hackers, each adapting the default system to his or her own needs, extending and developing it as needed along the way.
  • 95. Debugging, in this perspective, is not separate from design. Both are part of a spectrum of changes and improvements whose goals and direction are governed by the users and the developers themselves... - Christopher Kelty Two Bits - the cultural significance of Free Software (2008) The key element in free software development is the ability to see the development of software as a spectrum rather than a series of releases. This implies more than just continuous work on a product; it means seeing the product itself as something fluid, built out of previous ideas and products and transforming, differentiating into new ones. Debugging, in this perspective, is not separate from design. Both are part of a spectrum of changes and improvements whose goals and direction are governed by the users and the developers themselves. In this way, free software is a continuing praxis of "figuring out" - giving up a sense of finality in the product in order to continually adapt and redesign the system. Kelty argues that successful free software projects are "experiments with adaptability - attempts at figuring out - that have worked, to the surprise of many who have insisted that complexity requires planning and hierarchy."
  • 96. A recursive public This constant self-referential experimentation, “figuring out”, and adaptation of the Ubuntu system results in what Christopher Kelty calls “a recursive public”: A recursive public is a public "whose existence (which consists solely in address through discourse) is possible only through discursive and technical reference to the means of creating this public."
  • 97. That is to say: The Ubuntu community exists solely to talk about and coordinate the continued development of the Ubuntu system, which is also the means to develop Ubuntu, which is also the means through which they are able to discuss and develop Ubuntu. It is a self-referential, recursive loop of discourse. Ubuntu is recursive through its discourse about technology (the Ubuntu system), which is also the technological means that make that discourse possible. Furthermore, this technology consists of many recursively dependent layers of technical infrastructure: The applications, the operating system, the internet protocols... Indeed, the depth of recursion is determined by the openness necessary for the project itself.
  • 98. In this way, Ubuntu and other free software projects are very similar to Douglas Hofstadter’s notion of strange loops: with each new development cycle, the system and the community go further than ever before, figuring out and adapting the system to their needs, only to return to the starting point: the unstable continuum of software-in-development whence they started out. A state of permanent development from which a new version branches out, wiser and better than before. But the system and the community themselves are never finished. It is a technological ecosystem of sorts: the individual developers depend on the system to augment their intellect in their daily work. The developers depend on each other to ensure that the system as a whole continues to function. And the system itself depends on the developers continuing to develop and maintain it.
  • 99. Age plus adaptivity is what makes a building come to be loved. The building learns from its occupants and they learn from it. - Stewart Brand How Buildings Learn (1994) In his book “How Buildings Learn”, Stewart Brand concludes that “Age plus adaptivity is what makes a building come to be loved.” It is the strange loop of shared learning between the various occupants shaping the building, and the building shaping them in turn, that results in good buildings, loved and lived in over the centuries.
  • 100. Age plus adaptivity is what makes a system come to be loved. The system learns from its users and they learn from it. Based on my fieldwork, I would argue that the same can be said of computer systems - it is the strange loop of shared learning between the various users shaping the system, and the system shaping them in turn, that is at the core of a good computer system. We have lived in houses for hundreds of years, adapting them to our changing needs. We have only had computer systems for 40 years or so. But already we are finding ways and processes of building and maintaining systems together so that they can adapt endlessly and stand the test of time.
  • 101. ? This leaves room for a lot of interesting questions and themes for further exploration: - How will we as computer users change as a result of becoming steadily more dependent on computer systems like Ubuntu to augment our intellect? - What role will average users (not developers), incapable of adapting the details of the system to their personal needs, play in the development of the Ubuntu system? - How does one become an Ubuntu developer (this question links with my work on Ubuntu as a community of practice, which I’ll have to do another presentation on at some point)? - What is the broader cultural significance of the practices of free software development in relation to how we collaborate and organize ourselves in other contexts? - If code is law, as Lawrence Lessig and others have stated, how can the development and governance of operating systems be analogous to, for instance, the development of democratic constitutions? - If technology becomes a bigger part of other public spheres - such as public administration and decision making - will they have to become recursive publics in their own right? - How would a lack of finality - the notion of being in a fluid continuum, a state of permanent development - affect other public spheres?
  • 102. Thank you! email: andreas.lloyd@gmail.com weblog: http://www.andreaslloyd.dk thesis: http://www.andreaslloyd.dk/thesis twitter: @andreaslloyd Photo credits: http://www.flickr.com/photos/gareandkitty/237547991 http://www.flickr.com/photos/stuckincustoms/201034654 http://www.flickr.com/photos/stuckincustoms/302644832 http://www.flickr.com/photos/noirin/177599802 http://www.flickr.com/photos/omar_eduardo/127707517 http://www.flickr.com/photos/wallyg/158337907 http://www.flickr.com/photos/ableman/2171326385 http://www.flickr.com/photos/wetwebwork/2847766967 http://www.flickr.com/photos/chanc/1389379381 http://www.flickr.com/photos/sfllaw/380959080 http://www.flickr.com/photos/kables/9874584 http://www.flickr.com/photos/stuckincustoms/217440037 http://www.flickr.com/photos/z287marc/3189567558 http://www.flickr.com/photos/75491103@N00/3565601486 http://www.flickr.com/photos/runnerone/1487082237 I’d love to hear your questions, answers and comments. So if you have any, please get in touch. Thanks for reading!