February 18, 2020


The serial numbers of NSA reports

(Updated: February 16, 2020)

On January 14, the NSA disclosed a serious vulnerability in the CryptoAPI service of the Windows 10 operating system (vulnerability identifier: CVE-2020-0601). In a rare public Cybersecurity Advisory the agency even offered further details about this issue.

An interesting detail is that this Cybersecurity Advisory has two serial numbers in the same format the NSA uses on its Top Secret intelligence reports, some of which have been published by Wikileaks and as part of the Snowden leaks.

The serial numbers on the NSA's Cybersecurity Advisory from January 14, 2020

The NSA's Cybersecurity Advisory has three groups of letters and numbers, the last one being the date of the document in the format month/day/year, which is typical for the United States.

The first group seems to be an external serial number, while the second group is more like an internal serial number. Below, the components of both serial numbers will be discussed in detail.

External serial number

The first serial number on the public Cybersecurity Advisory is similar to the serial numbers on a range of highly classified intelligence reports which were published by Wikileaks in June and July 2015 and in February 2016. These documents were not attributed to Edward Snowden, so they were probably provided by a still unknown "second source".

These intelligence reports were part of various editions of the "Global SIGINT Highlights - Executive Edition" briefings. Wikileaks published only one report in its original layout, with header and disclaimer. In the bottom right corner, these reports carry one or two serial numbers, one for each source of intelligence:

NSA intelligence report about an intercepted conversation between French president
François Hollande and prime minister Jean-Marc Ayrault, May 22, 2012.
(Watermarked by Wikileaks)

The serial numbers are followed by a timestamp in the standard military notation: for example, 161711Z stands for the 16th day, 17 hours and 11 minutes ZULU (= Greenwich Mean) Time, with the month and the year as mentioned in the briefing.
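For illustration, this notation can be unpacked mechanically. Below is a minimal sketch in Python, assuming the plain DDHHMMZ form described above (the function name is mine, not any official tool):

```python
# Sketch: unpack a military date-time group like "161711Z" into its parts.
# Assumes the DDHHMM + "Z" form described above; month and year come from
# the briefing itself, so they are not part of the group.
import re

def parse_dtg(dtg: str) -> dict:
    m = re.fullmatch(r"(\d{2})(\d{2})(\d{2})Z", dtg)
    if not m:
        raise ValueError(f"not a DDHHMMZ date-time group: {dtg!r}")
    day, hour, minute = (int(g) for g in m.groups())
    return {"day": day, "hour": hour, "minute": minute, "timezone": "ZULU (GMT)"}

print(parse_dtg("161711Z"))
```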

The first five intelligence reports published by Wikileaks were from 2006 to 2012 and have the following serial numbers:

Briefings of this kind are called serialized reports, which the NSA SIGINT Reporter's Style and Usage Manual describes as "The primary means by which we provide foreign intelligence information to intelligence users, most of whom are not part of the SIGINT community. A report can be in electrical, hard-copy, video, or digital form, depending on the information's nature and perishability."

The NSA Style Manual also explains the serial numbers of these reports: "Serial numbers are assigned to NSA reports on a one-up annual basis according to the PDDG issuing the report. Every serial includes the classification level, the PDDG of the originator, and a one-up annual number, as in the following examples:

The classification level of a report can be represented by a variety of codes. Comparing the first part of the serial number with the classification marking of a particular report shows that they are assigned according to the following scheme (updated and corrected):

1 = ?
2 = ?
3 = (Top Secret) Comint
E = ?
G = (Top Secret) Comint-Gamma
I = ?
S = ?
U = Unclassified
Z = NoForn

The Producer Designator Digraph (PDDG) consists of a combination of two letters and/or numbers and designates a particular "collector". These codes refer to NSA collection facilities and programs, but those with double vowels stand for the signals intelligence agencies of the Five Eyes partnership, as was already revealed in Nicky Hager's book Secret Power from 1996:
AA = GCHQ, United Kingdom
EE = DSD, now ASD, Australia
II = GCSB, New Zealand
OO = NSA, United States
UU = CSE, Canada

The one-up annual number doesn't seem to be a continuous count for each year: on the Windows vulnerability advisory the one-up number is 104201, which would mean that the NSA had already produced over one hundred thousand reports in the first two weeks of 2020 alone. That's not realistic, so perhaps number ranges are assigned to each producer, or something similar.

Finally, the year in which the report was issued is represented by its last two digits.
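Putting the pieces together, the external format can be sketched as a small parser. This is a hypothetical reconstruction: the example serial simply combines the one-up number 104201 and year 20 mentioned above with the codes U and OO from the tables, and the lookup tables contain only the values identified so far:

```python
# Sketch of a parser for the external serial-number scheme described above:
# <classification code>/<PDDG>/<one-up number>-<two-digit year>.
# The lookup tables only contain values identified in the text;
# everything else is reported as unknown.
import re

CLASSIFICATION = {
    "3": "(Top Secret) Comint",
    "G": "(Top Secret) Comint-Gamma",
    "U": "Unclassified",
    "Z": "NoForn",
}
PDDG = {
    "AA": "GCHQ (United Kingdom)",
    "EE": "DSD/ASD (Australia)",
    "II": "GCSB (New Zealand)",
    "OO": "NSA (United States)",
    "UU": "CSE (Canada)",
}

def parse_serial(serial: str) -> dict:
    m = re.fullmatch(r"([A-Z0-9])/([A-Z0-9]{2})/(\d+)-(\d{2})", serial)
    if not m:
        raise ValueError(f"unrecognized serial format: {serial!r}")
    cls, pddg, one_up, yy = m.groups()
    return {
        "classification": CLASSIFICATION.get(cls, "unknown"),
        "producer": PDDG.get(pddg, "unknown"),
        "one_up_number": int(one_up),
        "year": 2000 + int(yy),  # assumes a 21st-century report
    }

print(parse_serial("U/OO/104201-20"))
```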

Internal serial number

The second series of letters and numbers on the NSA's Cybersecurity Advisory seems to be an internal serial number. In this case it's PP-19-0031, a format that we also saw on the draft of the famous NSA Inspector General's report about the STELLARWIND program, which was leaked by Edward Snowden. This draft report is dated March 24, 2009 and has the serial number ST-09-0002:

Comparing these two serial numbers indicates that the two digits in the middle represent the year, while the last four digits are most likely a one-up annual number. The first two letters may be an internal code for the producer: the office, bureau or unit that prepared and issued the report.

This two-letter code corresponds neither to the PDDG nor to NSA's organizational designators, in which D1 stands for the Office of the Inspector General, so there must be another, still unknown system behind these codes.
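The inferred internal format can likewise be sketched as a parser. Again a hypothetical reconstruction, based only on the two serials seen so far:

```python
# Sketch of the internal serial-number layout inferred above:
# <two-letter producer code>-<two-digit year>-<four-digit one-up number>.
import re

def parse_internal_serial(serial: str) -> dict:
    m = re.fullmatch(r"([A-Z]{2})-(\d{2})-(\d{4})", serial)
    if not m:
        raise ValueError(f"unrecognized internal serial: {serial!r}")
    producer, yy, one_up = m.groups()
    return {
        "producer_code": producer,   # meaning unknown; not the PDDG
        "year": 2000 + int(yy),      # assumes a 21st-century report
        "one_up_number": int(one_up),
    }

print(parse_internal_serial("PP-19-0031"))
print(parse_internal_serial("ST-09-0002"))
```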


After this comparative analysis it has become clear that the serial numbers (and the date) of the NSA's Cybersecurity Advisory can be explained as follows:

by P/K (noreply@blogger.com)

February 15, 2020

n-gate.com. we can't both be right.

webshit weekly

An annotated digest of the top "Hacker" "News" posts for the second week of February, 2020.

My productivity app for the past 12 years has been a single .txt file
February 08, 2020 (comments)
An academic takes notes in a single text file. And a calendar server. And an email server. Hackernews also uses text files, email servers, and calendars, but they use them better, and they're here to recommend expensive books from several online vendors who offer free shipping. The entire remainder of the comment thread is people suggesting taking notes in a notebook, who are shouted down by the ancient refrain: it's not searchable! Only a computer can organize text.

Linux 5.6 is the most exciting kernel in years
February 09, 2020 (comments)
An Internet is extremely enthusiastic about computer hardware drivers for a living. Hackernews is far more excited that they can now run their computers with a networked filesystem provided by the only company they trust with the task: Microsoft. Another pack of Hackernews are pleased about incoming support for kernel performance improvements they don't understand and cannot use.

Microsoft begins showing an anti-Firefox ad in the Windows 10 start menu
February 10, 2020 (comments)
Microsoft continues the war against its most entrenched and dangerous threat: a struggling webshit vendor who is hemorrhaging money. Hackernews recalls the isolated incidents when Microsoft engaged in unethical business practices hostile to community developed software. Other Hackernews indicate the infrequent nature of this continual and unending series of miscommunications, overreaches, and/or accidental attacks points to a failure of leadership, as a competent executive team would have reached total monopoly by climbing a staircase made of the corpses of copyright cultists.

How the CIA used Crypto AG encryption devices to spy on countries for decades
February 11, 2020 (comments)
The Washington Post, owned by Jeff Bezos, recounts the story of a digital communications company with close ties to the American government that turned out to be a conduit for the world's secrets directly into United States intelligence agencies. Hackernews suspects that the events of this story are not the only instance of the United States government engaging in such fuckery, but can't seem to decide if that's the worst possible problem there could be or else probably for the best. In the end, Hackernews decides the right action to take is to double-check their AmazonSmile charity is not set to the 'Intelligence and National Security Foundation'.

February 12, 2020 (comments)
A webshit makes javascript about math. Hackernews likes the pretty pictures, but thinks they'd be better if Hackernews made them, or if they were made by Hackernews' favorite almost-identical gear-oriented webshit, or if we could print animated books. Other Hackernews just want to point out how important math is to people who need to use math.

I Add 3-25 Seconds of Latency to Every Site I Visit
February 13, 2020 (comments)
An Internet tries to invent digital Methadone. Hackernews recommends the same procedures for disciplining teenagers over the internet. Half of the comments are about which Python script Hackernews uses to pirate Youtube videos. The other half are struggling with the ontology of gratification. On one hand, spake Hackernews, it cannot be wise to gratify oneself instantly and frequently. On the other hand, quoth Hackernews, it's fine to jerk off constantly until you chafe your genitals to stone and die of dehydration, as long as you can quote economics textbooks to people who are only present to wait for their turn to quote their favorite folksy street philosopher.

18-year-old personal website, built with Frontpage and still updated
February 14, 2020 (comments)
A webshit encrypts their website with some kind of vowel-laden kitchen Latin. Hackernews somberly admonishes us to learn the important lesson of the long-lived website: for your message to survive unto future generations, it must be graven upon naked stone, using Microsoft software. After an interminably tedious discussion about which subset of modern webshit will survive into antiquity (defined here as five or more years from now), one Hackernews applauds the site for scoring 100 on Google's PageSpeed analyzer. Literally nobody complains that the site is linked via unencrypted HTTP. Nobody points out the site is not available over HTTPS at all. Nobody even mentions subjecting themselves to Let's Encrypt. Seems like those are all only absolute necessities when the author is peddling webshit.

by http://n-gate.com/hackernews/2020/02/14/0/

February 14, 2020

Data Knightmare (Italian podcast)

DK 4x22 - Bello come un aeroporto

As further proof that facial recognition is a solution in search of a problem, passengers at Linate will be able to board via facial recognition. Provided they *also* carry their boarding pass.

by Walter Vannini


US government uses Swiss diplomatic network to communicate with Iran

(Updated: February 12, 2020)

A number of countries are connected to each other through bilateral hotlines in order to prevent misunderstandings and miscommunication in times of severe crisis. But what happens when there's a crisis between two countries that don't have a hotline?

Such a situation occurred after the United States killed the Iranian general Qassem Soleimani on January 3. Because there's no hotline between these countries, the US government used the Swiss diplomatic network to urge Tehran not to escalate the crisis, as the Wall Street Journal reported.

The Swiss embassy in Tehran
(photo: FDFA)

Intermediary since 1980

Ever since 1980, when Iranian revolutionaries seized the US embassy in Tehran and took 52 Americans hostage, the Swiss have acted as a messenger (or Briefträger) between the American and the Iranian government. The US appointed Switzerland as its "protecting power" in Iran, and a special United States Interest Section was established in the Swiss embassy to handle informal contacts.

After the American invasion of Iraq in 2003, Swiss diplomats relayed messages between the US and Iran to prevent direct clashes, and under president Obama, Switzerland hosted the talks that led to the Iran nuclear deal of 2015. The Swiss ambassador in Iran regularly visits Washington to explain Iran's politics to officials from the Pentagon, the State Department and US intelligence agencies.

The Swiss embassy in Washington DC
(photo: keystone)

The Soleimani crisis

Details about the informal contacts between the US and Iran have now been revealed by the Wall Street Journal (WSJ): immediately after Washington confirmed the death of general Soleimani on January 3, the US government sent a first urgent message to Tehran asking not to escalate the situation.

The Swiss ambassador delivered the American message by hand to the Iranian foreign minister Javad Zarif early on Friday morning, as the WSJ learned from US and Swiss officials. But Zarif apparently responded to the message with anger, saying that "[U.S. Secretary of State Mike] Pompeo is a bully" and that "The U.S. is the cause of all the problems."

Two days later, on January 5, Zarif called the Swiss ambassador asking to relay his response to the US government, which appeared more restrained. This was followed by a range of back and forth messages, "far more measured than the fiery rhetoric traded publicly by politicians", which for now seem to have helped prevent a military clash between both countries.

The Swiss diplomatic network

The message from the US government to Iran "arrived on a special encrypted fax machine in a sealed room of the Swiss mission" and the WSJ adds that this "equipment operates on a secure Swiss government network linking its Tehran embassy to the Foreign Ministry in Bern and its embassy in Washington. Only the most senior officials have the key cards needed to use the equipment."

Using the secure diplomatic communications network of a third party is a good option for exchanging sensitive messages between two countries, because it's not always possible to set up a direct communications link. One of the difficulties is that in order to encrypt such communications, both parties have to use the same algorithms and, obviously, countries don't like to share their crypto systems with others.

Similar to an official direct bilateral hotline, the back channel through the Swiss diplomatic network appears to be effective, because both parties "can trust a message will remain confidential, be delivered quickly, and will reach only its intended recipients. Statements passed on the back channel are always precisely phrased, diplomatic, and free of emotion" - according to the WSJ report.

The west wing of the Federal Palace (Bundeshaus) in Bern, Switzerland,
home of the Federal Department of Foreign Affairs (FDFA)
(photo: Mike Lehmann/Wikimedia Commons)

Swiss crypto manufacturers

Switzerland's neutrality is not only useful for diplomatic negotiations, but was also an advantage for the manufacturers of crypto equipment.

One of the oldest companies was Gretag AG, which dates back to 1947. In 1987 most of its cryptographic business was split off and transferred to Omnisec AG, which had been founded especially for this purpose. The civilian encryption machines, however, as well as those for the Swiss government, were sold to AT&T in 1991. Under a new owner, this product line was renamed Safenet Data Systems, but the company declined rapidly and was liquidated in 2004.

Most of the business had already been taken over by Omnisec AG, which had become one of the most trusted crypto companies in the world, selling its voice, fax and data encryptors to governments, armies and intelligence services. But the market for proprietary Swiss encryption technology gradually shrank, and so the company was dissolved in February 2018.

Another Swiss crypto manufacturer is Crypto AG, which was established in 1952 by Boris Hagelin for the activities of his original Swedish company AB Cryptoteknik. Crypto AG became one of the most famous companies in the crypto business, but there were also several allegations that it cooperated with intelligence agencies like the NSA and the German BND.

On February 11, 2020, American and German news outlets revealed that in 1970, the CIA and the German foreign intelligence agency BND took over the ownership of Crypto AG, which gave them easier access to the encrypted diplomatic and military communications of over 100 countries (the Swiss government always got the secure versions of Crypto AG's equipment, though). In 1993, the BND sold its share to the CIA, which continued to run the company until 2018, when Crypto AG was sold to a Swedish entrepreneur.

The Crypto AG Fax Encryption HC-4221
(photo: Crypto AG brochure)

Swiss fax encryption systems

Both Omnisec AG and Crypto AG produced devices for encrypting fax transmissions. Omnisec had the Omnisec 525 Fax Encryptor, which was available in 2007, while Crypto AG manufactured the Fax Encryption HC-4220, which was succeeded by the HC-4221 and was still available in 2011.

These aren't secure fax machines in the strict sense, but separate crypto devices that encrypt the signals from a commercial fax machine before they are transmitted over the public switched telephone network (PSTN). As can be seen in the photo, the HC-4220 was designed so that the actual fax machine could be placed on top of it.

Currently, Crypto AG offers the HC-9300 Crypto Desktop, a futuristic-looking touchscreen device that encrypts telephone, fax, VoIP and e-mail communications. This device has been available since at least 2015 and is approved by the Technical Secretariat of the OPCW, for example for use during inspections.

Maybe the Swiss diplomatic network already uses the HC-9300 to secure its fax messages, but in general, government agencies tend to be rather conservative and stick to older equipment, not least because new crypto equipment has to undergo rigorous testing before it may be used to protect classified information.

See also:
- The hotline between Washington and the former German capital Bonn
- The Washington-Moscow Hotline

by P/K (noreply@blogger.com)

February 11, 2020

Institute of Network Cultures

Mapping Independent Culture in Zagreb

Last month INC proudly published Sepp Eckenhaussen’s Scenes of Independence: Cultural Ruptures in Zagreb (1991-2019) as the third edition of the Deep Pockets series. It tells the story of ‘independent culture’ in Zagreb in a way that is at once theoretical, practical, and personal. Below you can read an excerpt. To get a free download or order a print-on-demand copy of the full book, please visit the publication page.

Zagreb, Rebecca West says, ‘has the endearing characteristic, noticeable in many French towns, of remaining a small town when it is in fact quite large’. She wrote these words in the late 1930s, when Zagreb had just over two hundred thousand inhabitants. By 2019, this number had almost quadrupled. Yet a similar feeling captures me while roaming the city today. It seems like Zagreb is a capital and a village at the same time. It is almost impossible to get lost in the streets, squares and parks squeezed between Mount Medvednica and the Sava River.

Zagreb – between the Sava and Medvednica.

According to West, Zagreb’s village-like character is ‘a lovely spiritual victory over urbanization’. A dubious compliment. Within a few years after West’s visit, two hundred thousand refugees of World War II settled in the city, affirming that in Zagreb, too, the force of urbanization is more than capable of bending the laws of spiritual life. It could hardly be said that West was naïve, though. Her Black Lamb and Grey Falcon: A Journey Through Yugoslavia is widely regarded as one of the greatest – some say the most foreseeing – books ever written about Yugoslavia. It describes with great finesse and pointed humor the life of the Balkan peoples during centuries of hardships and the constant political quarrels amongst Serbs, Croats, Slovenes, Bosniaks, Albanians, and Macedonians. If anything, West’s spiritualist interpretation of life in Zagreb hints at the understanding that this city is not so easily readable.

The Britanski Trg market on an average weekday.

Zagreb is a fragmented city; its many neighborhoods seem to be different worlds. Even the city center, which is not very vast, is split up into three parts: one for politics, one for religion, and one for life. The old city, located on a hill and called Gornji Grad (upper town), is the seat of the Croatian government. From here, the country’s rulers have a wide view over the rest of the city and the Pannonian Basin beyond it. Over the past decades, Gornji Grad’s old age and altitude have also made it into a well-visited tourist attraction. As a result, a visitor of Gornji Grad will encounter the strange mix of formal power and touristic entertainment usually reserved for royal palaces. On the slope of the hill stands Zagreb’s magnificent cathedral with its towers in eternal scaffolds, surrounded by the clerical complexes. This is Kaptol. Together Gornji Grad and Kaptol are the epicenter of Croatian political and clerical power, which are deeply intertwined.

A view from Gornji Grad.

At the foot of the hill, Donji Grad (lower town) begins. This part of the city, built in the nineteenth century, is bordered by Gornji Grad to the north and the Green Horseshoe on all other sides. The Green Horseshoe consists of three boulevards modeled after the Ringstraße in Vienna. Its main elements are leafy parks, botanical gardens, and neo-renaissance pavilions designed to simultaneously impress and relax flaneurs and other passers-by. In Donji Grad, one can find the main shopping street Ilica, restaurants, hotels, a handful of one-room cinemas, the botanical gardens, the train station, and the main square Trg Ban Jelačić – usually simply called Trg. It is here that public everyday life takes place. The fact that Zagreb’s urban life is this concentrated is quite joyful. Hardly a day passes without a random encounter with a friend or colleague.

The equestrian statue of Ban Jelačić.

A few kilometres to the east of the city centre, one finds the green pearl of Zagreb: Maksimir Park. The huge park contains five basins full of turtles, a small zoo, a restaurant pavilion overlooking the tops of the trees, and enough lush green and small pathways to wander around for a full day. No wonder young families, dog-walkers, sunbathers and tourists flock to the park whenever the sun comes out. It was built by Zagreb’s bishops in the late 18th century and was the first large public park in South-Eastern Europe. A statement of civilization. The famous Yugoslavian writer Miroslav Krleža wrote about Maksimir in his 1926 Journey to Russia:

Where does Europe begin and Asia end? That is far from easy to define: while the Zagreb cardinals’ and bishops’ Maksimir Park is definitely a piece of Biedermeier Europe, the village of Čulinec below Maksimir Park still slumbers in an old Slavic, archaic condition, with wooden architecture from ages prehistorical, and Čulinec and Banova Jaruga to the southeast are the immediate transition to China and India, snoring all the way to Bombay and distant Port Arthur.

So, this wonderful place of leisure, so progressive at the time of its construction, shows the particular position of Zagreb in cultural discourse; an explosive position on ‘the fault line between civilizations’, as Samuel P. Huntington called it in The Clash of Civilizations?. This strange place on the brink of East and West has for centuries played an important role in the identification of Croatian culture, both from inside and out, and contributed to the Balkans’ reputation as the Powder Keg of Europe. For is it not inevitable that, when cultures so different from one another meet in one place, clashes ensue? Croatia is, in other words, on the frontier of the Culture Wars.*

Around the old city of Zagreb, beyond the comfort of the Viennese boulevards of the lower town and the picturesque alleys of the upper town, socialist-era architecture arises. Walking through the maze of streets and courtyards just south of the city center, a visitor might run into the impressive sight of the Rakete: a complex of three rocket-shaped towers designed by Centar 51 in 1968, which will soon be discovered by photographers with a brutalist fetish. And even further south, cut off from the rest of the city by the river Sava, is Novi Zagreb (New Zagreb). This part of town was built by the order of Marshal Tito to accommodate a new, socialist urban life. Between the typical socialist high rises, a huge horse racing track was built, a new national library, and, more recently, the Museum of Contemporary Art. But despite these grand public works, social life in Novi Zagreb takes place mainly in the cafés of its malls and on the gigantic flea market Hrelić.

High rises in Novi Zagreb.

In this fragmented city with its many testimonies of a rich and turbulent history lives a unique culture. There is the culture embodied by the stately Viennese-style buildings of the Museum of Modern Art, the Archaeological Museum, the Art Pavilion, and the National Theatre, which take up unmistakably symbolic spaces along the promenades of the Green Horseshoe. But then there is also another, more interesting culture in Zagreb. This other culture emerged after Yugoslavia disintegrated in 1991 and Croatia became an independent state for the first time since the Middle Ages. Insiders refer to it as ‘independent culture’ or ‘non-institutional culture’.

Independent culture is just as present in the city center as the grand institutions, yet not immediately recognizable to the outsider. A stone’s throw away from the Archaeological Museum, hidden away in the arcade of a courtyard, there is the small Galerija Nova. Galerija Nova is the only exhibition space in Zagreb structurally presenting art exhibitions which carry solidarity with migrants – a highly sensitive topic in this country on the border of the European Union. One block further still, in the back of another courtyard, Multimedia Institute and Hacklab MAMA is located, the base for Croatian media art and digital culture since the early 2000s. It also happens to be the best library in the fields of new media and commons in Croatia. A few minutes eastward by bike, in the posh Martićeva Street, there is a café which at first glance looks like any other. Once inside, however, it turns out that this place, which is called Booksa, is a hotspot of cultural life, where cultural workers come to drink coffee, meet, work, and read. A few tram stops from Booksa, in the fold line between the railroads and The Westin Zagreb, there is an old pharmaceutical factory. Today, it is a former squat called Medika. Instead of medicines, it now produces punk concerts and glitch art exhibitions.

I could go on like this for a while. Because with every visit to one of these places, one meets people and finds out about other places like it: the experimental dance company BADco., curatorial collective BLOK, news outlet Kulturpunkt, youth culture hub Pogon, anarchist bookshop Što Čitaš?, the old socialist Student Centre, platform organizations Clubture and Right to the city, and Documenta – Centre for Dealing with the Past. Like a distributed web, these organizations permeate the urban tissue of Zagreb. They make up a kind of village-like social system in which most people know each other personally and have often worked together at some point. Independent culture is a scene.

Now, if I’m raising the impression independent culture is either a subculture or a purely local phenomenon, I should correct myself immediately. Independent culture includes well-known, (internationally) established organizations. MAMA’s programs include many Croatian contributors, but also Catherine Malabou, Geert Lovink and Pussy Riot. The institute published the latest book by the French philosopher Jacques Rancière, one of my personal favorites amongst contemporary thinkers. WHW, the curatorial collective which directs Galerija Nova, works with internationally renowned artists like Mladen Stilinović, Sanja Iveković, and David Maljković. They have, moreover, been appointed as director of Kunsthalle Wien in the summer of 2019. These are just some examples to show that, while certainly embedded locally, independent culture is an internationally oriented scene.

It is hard to pinpoint exactly what type of culture is created in independent culture, since the practices of the various organizations in it differ so much. It includes but is not limited to dance, performance art, theatre, visual arts, new and old media, experimental cinema, festivals, education, community work, research, discursive programs, networking, and advocacy. It is clear that independent culture transgresses the boundaries of traditional cultural disciplines. The only general characteristic is that while all of these organizations work with culture, none work within the strict confinements of the art world or artistic production – a characteristic so common that it cannot define a scene. So, what is it that connects the scene, apart from personal relations, a shared urban environment, and the fact everyone in it does ‘something with culture’?

If anything, the organizations within independent culture are united by a common political outlook (not to be confused with a political agenda). Their programming embodies a conglomeration of activist discourses leaning to the left of the political spectrum. Amongst other things, they focus on anti-fascism, pacifism, commons activism, feminism and queer activism, decoloniality, and ecological activism. Some would say that Yugonostalgia is rather common in independent culture, others would say that they’re Yugofuturist. In order to be able to have this political agency in the context of Croatia, which is predominantly ruled by right-wing and nationalist forces, the scene is organized separately from the state-funded cultural infrastructure. This shows, by approximation, what the independence is that ties together Zagreb’s independent cultural scene. Being rooted in grassroots activisms rather than large institutions governed by state and local governments, independent culture claims to work, indeed, independently from the dominant power of the state.

But contradictions abound. From the moment of its emergence in the 1990s, the independent cultural infrastructure depended largely on international philanthropic organizations such as the George Soros Foundation, the Rosa Luxemburg Foundation, and the European Cultural Foundation, as well as for-profit organizations such as the Viennese Erste Bank. Then, since the mid-2000s, international funds have retreated from Croatia, making independent cultural organizations more reliant on state funding, effectively incentivizing them to engage in advocacy, self-institutionalization, and cultural policy-making. It is questionable, then, how independent or non-institutional the independent cultural scene really is at this point. Is it a product of local urgency and grassroots engagement, or of neoliberal and neo-imperial phenomena like globalization and cultural entrepreneurship? Is it possible that it is both? And if so, what is the interrelation between these forces?

In its analysis of independent cultures, Scenes of Independence is at times sharp and critical. The struggles it speaks of are real, and addressing them can, as I have learned, be sensitive at times. Yet, in the end, my account is always informed by solidarity. I deeply appreciate the existence of the organizations gathered under the umbrella of independent cultures. Sensing the political subjectivity and collectivity of the scene, however fragile, is a relieving and inspiring experience, especially when coming from Amsterdam, a place where neoliberal hegemony is by now so complete that elements of collective resistance are nearly completely absent from the circuits of cultural production.

My goal in writing the book has been to instrumentalize my semi-outside perspective and to create an analysis that makes sense to and is useful for the reader in the local context. At the same time, I reckon that the question of independence (in- and outside of culture) is a globally relevant one. My book, therefore, discusses two different (although not separate) questions: What does independent culture in Zagreb look like to an outsider? And what insights do the struggles in Zagreb’s independent culture provide into the regimes of global neoliberalisms?

Find the whole book here:




*The Balkans have been the battleground of military power struggles between West and East for centuries. The expansion of the Ottoman Empire was halted by Western-European forces in the Balkans in the 16th and 17th centuries. In the early 20th century, the fall of the Ottoman Empire and the emergence of nation-states solidified the border between the liberal, Christian West and the Islamic East. Forced mass migrations and assimilations of ethnic and religious minorities took place, displacing Ottoman Christians West and Balkan-inhabiting Muslims East of the Bosporus. The friction between the ethnically and religiously diverse populations that had inhabited the Balkans for centuries led to two Balkan Wars, in 1912 and 1913. In 1914, by firing the shots that killed Archduke Franz Ferdinand of Austria, the Bosnian Gavrilo Princip triggered what was briefly considered the Third Balkan War, but is now known as the First World War. This series of events earned the Balkans their reputation as the ‘Powder Keg of Europe’, a reputation that was reinforced once again in reactions to the Yugoslav Wars in the 1990s. Moreover, the idea of the Balkans as a ‘Powder Keg’ was deepened by the rise of global identity politics heralded by the fall of the Iron Curtain in 1989. In The Clash of Civilizations? (1993), the American political scientist Samuel P. Huntington argued that after what Francis Fukuyama famously called the ‘end of history’, the ‘great divisions among humankind and the dominating source of conflicts will be cultural. […] The clash of civilizations will dominate global politics. The fault lines between civilizations will be battle lines of the future.’ This theory was utilized, if not designed, to justify the US in upholding the aggressive foreign policy rhetoric it has maintained from the Cold War to the present day. 
The argument that the Balkans are on the fault line of civilizations served in this agenda as a justification to keep regarding this area as the place where the West fights off the East.

by Miriam Rasch

February 09, 2020

Andy Wingo

state of the gnunion 2020

Greetings, GNU hackers! This blog post rounds up GNU happenings over 2019. My goal is to celebrate the software we produced over the last year and to help us plan a successful 2020.

Over the past few months I have been discussing project health with a group of GNU maintainers and we were wondering how the project was doing. We had impressions, but little in the way of data. To that end I wrote some scripts to collect dates and versions for all releases made by GNU projects, as far back as data is available.
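For a sense of what such collection involves (a hypothetical sketch, not Andy's actual scripts), the tarball names in an ftp.gnu.org directory listing can be split into package and version; the filename pattern below is a simplifying assumption, as real listings have more variants:

```python
import re

# Release tarballs on ftp.gnu.org generally look like "name-version.tar.gz".
# This pattern is a simplification of what a real collector would need.
TARBALL_RE = re.compile(r'^([a-z0-9-]+?)-(\d[\d.]*)\.tar\.(?:gz|xz|bz2|lz)$')

def parse_tarball(filename):
    """Return (package, version) for a release tarball name, or None."""
    m = TARBALL_RE.match(filename)
    if m is None:
        return None
    return m.group(1), m.group(2)

print(parse_tarball("guile-3.0.0.tar.gz"))    # → ('guile', '3.0.0')
print(parse_tarball("gnubg-1.06.002.tar.gz"))  # → ('gnubg', '1.06.002')
```

Pairing each parsed name with the file's modification time from the listing gives the (package, version, date) triples the analysis below works from.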

In 2019, I count 243 releases, from 98 projects. Nice! Notably, on ftp.gnu.org we have the first stable releases from three projects:

GNU Guix
GNU Guix is perhaps the most exciting project in GNU these days. It's a package manager! It's a distribution! It's a container construction tool! It's a package-manager-cum-distribution-cum-container-construction-tool! Hearty congratulations to Guix on their first stable release.
GNU Shepherd
The GNU Daemon Shepherd is a modern dependency-based init service, written in Guile Scheme, and used in Guix. When you install Guix as an operating system, it actually stages Scheme programs from the operating system definition into the Shepherd configuration. So cool!
GNU Backgammon
Version 1.06.002 is not GNU Backgammon's first stable release, but it is the earliest version which is available on ftp.gnu.org. Formerly hosted on the now-defunct gnubg.org, GNU Backgammon is a venerable foe, and uses neural networks since before they were cool. Welcome back, GNU Backgammon!

The total release counts above are slightly higher than what Mike Gerwitz's scripts count in his "GNU Spotlight", posted on the FSF blog. This could be because, in addition to files released on ftp.gnu.org, I also manually collected release dates for most packages that upload their software somewhere other than gnu.org. I don't count alpha.gnu.org releases, and there were a handful of packages for which I wasn't successful at retrieving their release dates. But as a first approximation, it's a relatively complete data set.

I put my scripts in a git repository if anyone is interested in playing with the data. Some raw CSV files are there as well.

where we at?

Hair toss, check my nails, baby how you GNUing? Hard to tell!

To get us closer to an answer, I calculated the active package count per year. There can be other definitions, but my reading is that an active package is one that has had a stable release within the preceding 3 calendar years. So for 2019, for example, a GNU package is considered active if it had a stable release in 2017, 2018, or 2019. What I got was a graph that looks like this:

What we see is nothing before 1991 -- surely pointing to lacunae in my data set -- then a more or less linear rise in active package count until 2002, some stuttering growth rising to a peak in 2014 at 208 active packages, and from there a steady decline down to 153 active packages in 2019.
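The active-package definition above is easy to make concrete. Here is a minimal Python sketch, with made-up release data standing in for the real data set:

```python
def active_count(releases, year, window=3):
    """Count packages with at least one stable release in the `window`
    calendar years ending at `year` (inclusive)."""
    recent = range(year - window + 1, year + 1)
    return sum(1 for years in releases.values()
               if any(y in years for y in recent))

# Hypothetical data: package -> set of years with a stable release.
releases = {
    "guile": {2017, 2019},
    "guix":  {2019},
    "ed":    {2015},   # last release falls outside the 2017-2019 window
}
print(active_count(releases, 2019))  # → 2
```

Running this for every year over the full data set produces the curve described above.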

Of course, as a metric, active package count isn't precisely the same as project health; GNU ed is indeed the standard editor, but it's not GCC. Still, we need measurements that indirectly indicate project health, and this is what I could come up with.

Looking a little deeper, I tabulated the first and last release date for each GNU package, and then grouped them by year. In this graph, the left blue bars indicate the number of packages making their first recorded release, and the right green bars indicate the number of packages making their last release. Obviously a last release in 2019 indicates an active package, so it's to be expected that we have a spike in green bars on the right.

What this graph indicates is that GNU had an uninterrupted growth phase from its beginning until 2006, with more projects being born than dying. Things are mixed until 2012 or so, and since then we see many more projects making their last release and above all, very few packages "being born".
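The first/last-release tabulation behind this graph can be sketched in the same way (again with invented data; `births_and_deaths` is an illustrative name, not from the actual scripts):

```python
from collections import Counter

def births_and_deaths(releases):
    """Per year, count packages making their first and last recorded
    release (releases: package -> set of release years)."""
    births = Counter(min(years) for years in releases.values())
    deaths = Counter(max(years) for years in releases.values())
    return births, deaths

# Hypothetical data for three packages.
releases = {"a": {1996, 2001}, "b": {2001, 2019}, "c": {2019}}
births, deaths = births_and_deaths(releases)
print(births[2001], deaths[2019])  # → 1 2
```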

where we going?

I am not sure exactly what steps GNU should take in the future but I hope that this analysis can be a good conversation-starter. I do have some thoughts but will post in a follow-up. Until then, happy hacking in 2020!

by Andy Wingo

February 08, 2020

n-gate.com. we can't both be right.

webshit weekly

An annotated digest of the top "Hacker" "News" posts for the first week of February, 2020.

Palindrome Day 20200202
February 01, 2020 (comments)
An Internet pretends there are date formats not specified in ISO 8601. Hackernews is still mad that America writes numbers down in the order they're spoken, because human behavior should obviously be driven entirely by some nerd's sense of aesthetics. Half of the comments are Hackernews arguing that whatever date format and postal address system they grew up with is a natural law. The other half agree, but grew up somewhere else.

Google Maps Hacks
February 02, 2020 (comments)
An underemployed computer nerd has too many telephones. Hackernews is dimly aware that they've placed half their lives into the hands of a faceless megacorporation with so little engineering acumen that it can be defeated by a bored person with a child's wagon. While some of them consider this to be idly alarming, most are more interested in reporting similar failures in traffic reporting systems run by advertising agencies. Later, another pack of Hackernews drill down into the real problem with traffic management: having to turn left, which is scary and should be outlawed.

The missing semester of CS education
February 03, 2020 (comments)
Some academics isolate the last remaining degrees of freedom in software development and target them for extermination. The academics report their indoctrination efforts to Hackernews and patrol the comment section. Hackernews is not convinced that computer scientists should know how to use computers at all, much less these specific computer programs. While some Hackernews firmly believe that universities should make this information available to interested students, others insist that it is better for learning institutions to focus on fundamental invariants in the computer science field, such as AdSense and AWS.

Google tracks individual users per Chrome installation ID
February 04, 2020 (comments)
A GitHub does not like that Google's software sends information to Google, and expresses this opinion with a passive-aggressive rhetorical question on a GitHub issue, which has the effect of preventing the Google who opened the issue from ever interacting with it again. Hackernews trawls through Google public-relations apocrypha to find a suitable body of text which can dismiss the entire class of concerns expressed in the original comment. The common consensus is that it is not possible to prevent your data from being uploaded to Google, and so it's best to lie down on your Chromebook, open your advertising preferences, and think of GMail.

Wacom tablets track every app you open
February 05, 2020 (comments)
An Internet does not like that Wacom's software sends information to Google, and expresses this opinion with an in-depth blog post detailing how to watch it happen. Hackernews opines that switching to Linux would fix this. Other Hackernews think you can just politely ask your hardware vendor to stop spying on you. Still others try to work out what the precise balance should be between the user having a private life and software developers' God-given right to know everything you do with any electronic device. Nobody suggests that a drawing-tablet manufacturer should not even try to collect user data.

Open-plan offices decrease face-to-face collaboration: study
February 06, 2020 (comments)
Some academics claim that free-range programmers do not produce as well as caged programmers. The entire article provides no information beyond this, but contains dozens of links to other useless articles on the same shitty website, none of which contain additional information. Nowhere is the academic study linked, so Hackernews has to go find it themselves. Instead of discussing, Hackernews just whines about their work environments, or reminisces about that one time they got to work somewhere nice. No technology is discussed.

Python dicts are now ordered
February 07, 2020 (comments)
A webshit has something to say about Python internals, but I couldn't focus on the article, because the first comment on the blog post involves the text "it brings Python on par with PHP," which is such a monumentally alien thought that I think I need medical attention. Hackernews argues about who already knew this, why, and how. Another argument breaks out about whether this is the Correct and Natural approach to data structures, or if it's Completely Wrong and Stupid because of some ridiculous edge case nobody cares about. Most of the complaints are from people who are deeply concerned that (entirely hypothetical) existing code might break in the case its author made extremely specific assumptions about one particular data structure in a programming language directly aimed at people who do not give a shit about these topics.

by http://n-gate.com/hackernews/2020/02/07/0/

February 07, 2020

Andy Wingo

lessons learned from guile, the ancient & spry

Greets, hackfolk!

Like just about every year, last week I took the train up to Brussels for FOSDEM, the messy and wonderful carnival of free software and of those that make it. Mostly I go for the hallway track: to see old friends, catch up, scheme about future plans, and refill my hacker culture reserves.

I usually try to see if I can get a talk or two in, and this year was no exception. First on my mind was the recent release of Guile 3. This was the culmination of a 10-year plan of work and so obviously there are some things to say! But at the same time, I wanted to reflect back a bit and look at the past with a bit of distance.

So in the end, my one talk was two talks. Let's start with the first one. (I'm trying a new thing where I share my talks as blog posts. We'll see how this goes. I know the rendering can be a bit off relative to the slides, but hopefully it's good enough. If you prefer, you can just watch the video instead!)

Celebrating Guile 3

FOSDEM 2020, Brussels

Andy Wingo | wingo@igalia.com

wingolog.org | @andywingo

So yeah let's celebrate! I co-maintain the Guile implementation of Scheme. It's a programming language. Guile 3, in summary, is just Guile, but faster. We added a simple just-in-time compiler as well as a bunch of ahead-of-time optimizations. The result is that it runs faster -- sometimes by a lot!

In the image above you can see Guile 3's performance on a number of microbenchmarks, relative to Guile 2.2, sorted by speedup. The baseline is 1.0x as fast. You can see that besides the first couple microbenchmarks where things are a bit inconclusive (click for full-size image), everything gets faster. Most are at least 2x as fast, and one benchmark is even 32x as fast. (Note the logarithmic scale on the Y axis.)

I only took a look at microbenchmarks at the end of the Guile 3 series; before that, I was mostly going by instinct. It's a relief to find out that in this case, my instincts did align with improvement.

mini-benchmark: eval

(primitive-eval
 '(let fib ((n 30))
    (if (< n 2)
        n
        (+ (fib (- n 1)) (fib (- n 2))))))

Guile 1.8: primitive-eval written in C

Guile 2.0+: primitive-eval in Scheme

Taking a look at a more medium-sized benchmark, let's compute the 30th fibonacci number, but using the interpreter instead of compiling the procedure. In Guile 2.0 and up, the interpreter (primitive-eval) is implemented in Scheme, so it's a good test of an important small Scheme program.

Before 2.0, though, primitive-eval was actually implemented in C. This had a number of disadvantages, notably that it prevented tail calls between interpreted and compiled code. When we switched to a Scheme implementation of primitive-eval, we knew we would have a performance hit, but we thought that we would gain it back eventually as the compiler got better.

As you can see, it took a while before the compiler and run-time improved to the point that primitive-eval in Scheme reached the speed of its old hand-tuned C implementation, but for Guile 3, we finally got there. Note again the logarithmic scale on the Y axis.

macro-benchmark: guix

guix build libreoffice ghc-pandoc guix \
  --dry-run --derivation

7% faster

guix system build config.scm \
  --dry-run --derivation

10% faster

Finally, taking a real-world benchmark, the Guix package manager is implemented entirely in Scheme. All ten thousand packages are defined in Scheme, the building scripts are in Scheme, the initial RAM disk is in Scheme -- you get the idea. Guile performance in Guix can have an important effect on user experience. As you can see, Guile 3 lowered elapsed time for some operations by around 10 percent or so. Of course there's a lot of I/O going on in addition to computation, so Guile running twice as fast will rarely make Guix run twice as fast (Amdahl's law and all that).
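That Amdahl's-law caveat is easy to quantify. As a sketch with hypothetical numbers (the 50% compute fraction below is invented for illustration): if only half of a guix operation's elapsed time is Guile computation, doubling Guile's speed buys far less than a 2x overall win:

```python
def amdahl(compute_fraction, speedup):
    """Overall speedup when only `compute_fraction` of the total work
    is accelerated by `speedup` (Amdahl's law)."""
    return 1.0 / ((1.0 - compute_fraction) + compute_fraction / speedup)

# If half the elapsed time is computation and Guile gets 2x faster,
# the whole operation only gets about 1.33x faster.
print(round(amdahl(0.5, 2.0), 2))  # → 1.33
```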

spry /sprī/

  • adjective: active; lively

So, when I was thinking about words that describe Guile, the word "spry" came to mind.

spry /sprī/

  • adjective: (especially of an old person) active; lively

But actually when I went to look up the meaning of "spry", Collins Dictionary says that it especially applies to the agèd. At first I was a bit offended, but I knew in my heart that the dictionary was right.

Lessons Learned from Guile, the Ancient & Spry

FOSDEM 2020, Brussels

Andy Wingo | wingo@igalia.com

wingolog.org | @andywingo

That leads me into my second talk.

guile is ancient

2010: Rust

2009: Go

2007: Clojure

1995: Ruby

1995: PHP

1995: JavaScript

1993: Guile (27 years ago!)

It's common for a new project to be lively, but Guile is definitely not new. People have been born, raised, and earned doctorates in programming languages in the time that Guile has been around.

built from ancient parts

1991: Python

1990: Haskell

1990: SCM

1989: Bash

1988: Tcl

1988: SIOD

Guile didn't appear out of nothing, though. It was hacked up from the pieces of another Scheme implementation called SCM, which itself was initially based on Scheme in One Defun (SIOD), back before the Berlin Wall fell.

written in an ancient language

1987: Perl

1984: C++

1975: Scheme

1972: C

1958: Lisp

1958: Algol

1954: Fortran

1930s: λ-calculus (90 years ago!)

But it goes back further! The Scheme language, of which Guile is an implementation, dates from 1975, before I was born; and you can, if you choose, trace the lines back to the lambda calculus, created in the mid-30s as a notation for computation. I suppose at this point I should say the mid-1930s, to disambiguate.

The point is, Guile is old! Statistically, most software projects from olden times are now dead. How has Guile managed to survive and (sometimes) thrive? Surely there must be some lesson or other that can be learned here.

ancient & spry

Men make their own history, but they do not make it as they please; they do not make it under self-selected circumstances, but under circumstances existing already, given and transmitted from the past.

The tradition of all dead generations weighs like a nightmare on the brains of the living. [...]

Eighteenth Brumaire of Louis Bonaparte, Marx, 1852

I am no philosopher of history, but I know that there are some ways of looking at the past that do not help me understand things. One is the arrow of enlightened progress, in which events exist in a causal chain, each producing the next. It doesn't help me understand the atmosphere, tensions, and possibilities inherent at any particular point. I find the "progress" theory of history to be an extreme form of selection bias.

Much more helpful to me is the Hegelian notion of dialectics: that at any given point in time there are various tensions at work. In our field, an example could be memory safety versus systems programming. These tensions create an environment that favors actions that lead towards resolution of the tensions. It doesn't mean that there's only one way to resolve the tensions, and it's not an automatic process -- people still have to do things. But the tendency is to ratchet history forward to a new set of tensions.

The history of a project, to me, is then a process of dialectic tensions and resolutions. If the project survives, as Guile has, then it should teach us something about the way this process works in practice.

ancient & spry

Languages evolve; how to remain minimal?

Dialectic opposites

  • world and guile

  • stable and active

  • ...

Lessons learned from inside Hegel’s motor of history

One dialectic is the tension between the world's problems and what tools Guile offers to understand and solve them. In 1993, the web didn't really exist. In 2033, if Guile doesn't run well in a web browser, probably it will be dead. But this process operates very slowly, for an old project; Guile isn't built on CORBA or something ephemeral like that, so we don't have very much data here.

The tension between being a stable base for others to build on, and in being a dynamic project that improves and changes, is a key tension that this talk investigates.

In the specific context of Guile, and for the audience of the FOSDEM minimal languages devroom, we should recognize that for a software project, age and minimalism don't necessarily go together. Software gets features over time and becomes bigger. What does it mean for a minimal language to evolve?

hill-climbing is insufficient

Ex: Guile 1.8; Extend vs Embed

One key lesson that I have learned is that the strategy of making only incremental improvements is a recipe for death, in the long term. The natural result is that you reach what you perceive to be the most optimal state of your project. Any change can only make it worse, so you stop moving.

This is what happened to Guile around version 1.8: we had taken the paradigm of the interpreter as language implementation strategy as far as it could go. There were only around 150 commits to Guile in 2007. We were stuck.

users stay unless pushed away

Inertial factor: interface

  • Source (API)

  • Binary (ABI)

  • Embedding (API)

  • CLI

  • ...

Ex: Python 3; local-eval; R6RS syntax; set!, set-car!

So how do we make change, in such a circumstance? You could start a new project, but then you wouldn't have any users. It would be nice to change and keep your users. Fortunately, it turns out that users don't really go away; yes, they trickle out if you don't do anything, but unless you change in an incompatible way, they stay with you, out of inertia.

Inertia is good and bad. It does conflict with minimalism as a principle; if you were to design Scheme in 2020, you would not include mutable variables or even mutable pairs. But they are still with us because if we removed them, we'd break too many users.

Users can even make you add back things that you had removed. In Guile 2.0, we removed the capability to evaluate an expression at run-time within the lexical environment of an expression, as we didn't know how to implement this outside an interpreter. It turns out this was so important to users that we had to add local-eval back to Guile, later in the 2.0 series. (Fortunately we were able to do it in a way that layered on lower-level facilities; this approach reconciled me to the solution.)

you can’t keep all users

What users say: don’t change or remove existing behavior

But: sometimes losing users is OK. Hard to know when, though

No change at all == death

  • Natural result of hill-climbing

Ex: psyntax; BDW-GC mark & finalize; compile-time; Unicode / locales

Unfortunately, the need to change means that sometimes you will lose users. It's either a dead project, or losing users.

In Guile 1.8, for example, the macro expander ran lazily: it would only expand code the first time it ran it. This was good for start-up time, because not all code is evaluated in the course of a simple script. Lazy expansion allowed us to start doing important work sooner. However, this approach caused immense pain to people that wanted "proper" Scheme macros that preserved lexical scoping; the state of the art was to eagerly expand an entire file. So we switched, and at the same time added a notion of compile-time. This compromise kept good start-up time while allowing fancy macros.

But eager expansion was a change. Users that relied on side effects from macro expansion would see them at compile-time instead of run-time. Users of old "defmacros" that could previously splice in live Scheme closures as literals in expanded source could no longer do that. I think it was the right choice but it did lose some users. In fact I just got another bug report related to this 10-year-old change last week.

every interface is a cost

Guile binary ABI: libguile.so; compiled Scheme files

Make compatibility easier: minimize interface

Ex: scm_sym_unquote, GOOPS, Go, Guix

So if you don't want to lose users, don't change any interface. The easiest way to do this is to minimize your interface surface. In Go, for example, they mostly haven't had dynamic-linking problems because that's not a thing they do: all code is statically linked into binaries. Similarly, Guix doesn't define a stable API, because all of its code is maintained in one "monorepo" that can develop in lock-step.

You always have some interfaces, though. For example, Guix can't change its command-line interface from one day to the next, because users would complain. But it's been surprising to me the extent to which Guile has interfaces that I didn't consider. Recently, for example, in the 3.0 release we unexported some symbols by mistake. Users complained, so we're putting them back in now.

parallel installs for the win

Highly effective pattern for change

  • libguile-2.0.so

  • libguile-3.0.so


Changed ABI is new ABI; it should have a new name

Ex: make-struct/no-tail, GUILE_PKG([2.2]), libtool

So how does one do incompatible change? If "don't" isn't a sufficient answer, then parallel installs is a good strategy. For example in Guile, users don't have to upgrade to 3.0 until they are ready. Guile 2.2 happily installs in parallel with Guile 3.0.

As another small example, there's a function in Guile called make-struct (old doc link), whose first argument is the number of "tail" slots, followed by initializers for all slots (normal and "tail"). This tail feature is weird and I would like to remove it. Unfortunately I can't just remove the argument, so I had to make a new function, make-struct/no-tail, which exists in parallel with the old version that I can't break.

deprecation facilitates migration

__attribute__ ((__deprecated__))

(issue-deprecation-warning
 "(ice-9 mapping) is deprecated."
 "  Use srfi-69 or rnrs hash tables instead.")

scm_c_issue_deprecation_warning
  ("Arbiters are deprecated.  "
   "Use mutexes or atomic variables instead.");

begin-deprecated, SCM_ENABLE_DEPRECATED

Fortunately there is a way to encourage users to migrate from old interfaces to new ones: deprecation. In Guile this applies to all of our interfaces (binary, source, etc). If a feature is marked as deprecated, we cause its use to issue a warning, ideally at compile-time when users responsible for the package can fix it. You can even add __attribute__((__deprecated__)) on C types!
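The same warn-then-migrate pattern exists in most ecosystems. As a point of comparison (this is Python's standard mechanism, not Guile's; `old_api`/`new_api` are made-up names), the `warnings` module lets a deprecated entry point keep working while nudging users toward its replacement:

```python
import warnings

def new_api(x):
    """The replacement interface."""
    return x * 2

def old_api(x):
    """Deprecated wrapper, kept around for one release series."""
    warnings.warn("old_api is deprecated; use new_api instead.",
                  DeprecationWarning, stacklevel=2)
    return new_api(x)

# Callers still get the old behavior, plus a migration warning.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = old_api(21)

print(result, caught[0].category.__name__)  # → 42 DeprecationWarning
```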

the arch-pattern

Replace, Deprecate, Remove

All change is possible; question is only length of deprecation period

Applies to all interfaces

Guile deprecation period generally one stable series

Ex: scm_t_uint8; make-struct; Foreign objects; uniform vectors

Finally, you end up in a situation where you have replaced the old interface and issued deprecation warnings to help users migrate. The next step is to remove the old interface. If you don't do this, you are failing as a project maintainer -- your project becomes literally unmaintainable as it just grows and grows.

This strategy applies to all changes. The deprecation period may last a while, and it may be that the replacement you built doesn't serve the purpose. There is still a dialog with the users that needs to happen. As an example, I made a replacement for the "SMOB" facility in Guile that allows users to define new types, backed by C interfaces. This new "foreign object" facility might not actually be good enough to replace SMOBs; since I haven't formally deprecated SMOBs, I don't know yet because users are still using the old thing!

change produces a new stable point

Stability within series: only additions

Corollary: dependencies must be at least as stable as you!

  • for your definition of stable

  • social norms help (GNU, semver)

Ex: libtool; unistring; gnulib

In my experience, the old management dictum that "the only constant is change" does not describe software. Guile changes, then it becomes stable for a while. You need an unstable series to escape hill-climbing; then, once you have found your new hill, you start climbing again in the stable series.

Once you reach your stable point, the projects you rely on need to exhibit the same degree of stability that you envision for your project. You can't build a web site that you expect to maintain for 10 years on technology that fundamentally changes every 6 months. But stable dependencies aren't something you can ensure technically; rather, stability relies on social norms about who makes the software you use.

who can crank the motor of history?

All libraries define languages

Allow user to evolve the language

  • User functionality: modules (Guix)

  • User syntax: macros (yay Scheme)

Guile 1.8 perf created tension

  • incorporate code into Guile

  • large C interface “for speed”

Compiler removed pressure on C ABI

Empowered users need less from you

A dialectic process does not progress on its own: it requires actions. As a project maintainer, some of my actions are because I want to do them. Others are because users want me to do them. The user-driven actions are generally a burden and as a lazy maintainer, I want to minimize them.

Here I think Guile has to a large degree escaped some of the pressures that weigh on other languages, for example Python. Because Scheme allows users to define language features that exist on par with "built-in" features, users don't need my approval or intervention to add (say) new syntax to the language they work in. Furthermore, their work can still compose with the work of others, even if the others don't buy in to their language extensions.

Still, Guile 1.8 did have a dynamic whereby the relatively poor performance of having to run all code through primitive-eval meant that users were pushed towards writing extensions in C. This in turn pushed Guile to expose all of its guts for access from C, which obviously has led to an overbloated C API and ABI. Happily the work on the Scheme compiler has mostly relieved this pressure, and we may therefore be able to trim the size of the C API and ABI over time.

contributions and risk

From maintenance point of view, all interface is legacy

Guile: Sometimes OK to accept user modules when they are more stable than Guile

In-tree users keep you honest

Ex: SSAX, fibers, SRFI

It can be a good strategy to "sediment" solutions to common use cases into Guile itself. This can improve the minimalism of an entire ecosystem of code. The maintenance burden has to be minimal, however; Guile has sometimes adopted experimental code into its repository, and without active maintenance, it soon becomes stale relative to what users and the module maintainers expect.

I would note an interesting effect: pieces of code that were adopted into Guile become a snapshot of the coding style at that time. It's useful to have some in-tree users because it gives you a better idea about how a project is seen from the outside, from a code perspective.

sticky bits

Memory management is an ongoing thorn

Local maximum: Boehm-Demers-Weiser conservative collector

How to get to precise, generational GC?

Not just Guile; e.g. CPython __del__

There are some points that resist change. The stickiest of these is the representation of heap-allocated Scheme objects in C. Guile currently uses a garbage collector that "automatically" finds all live Scheme values on the C stack and in registers. It was the right choice at the time, given our maintenance budget. But to get the next bump in performance, we need to switch to a generational garbage collector. It's hard to do that without a lot of pain to C users, essentially because the C language is too weak to express the patterns that we would need. I don't know how to proceed.

I would note, though, that memory management is a kind of cross-cutting interface, and that it's not just Guile that's having problems changing; I understand PyPy has had a lot of problems regarding changes on when Python destructors get called due to its switch from reference counting to a proper GC.


We are here: stability

And then?

  • Parallel-installability for source languages: #lang

  • Sediment idioms from Racket to evolve Guile user base

Remove myself from “holding the crank”

So where are we going? Nowhere, for the moment; or rather, up the hill. We just released Guile 3.0, so let's just appreciate that for the time being.

But as far as next steps in language evolution, I think in the short term they are essentially to further enable change while further sedimenting good practices into Guile. On the change side, we need parallel installability for entire languages. Racket did a great job facilitating this with #lang and we should just adopt that.

As for sedimentation, we should step back and consider whether any common patterns built by our users should be included in core Guile, and widen our gaze to Racket as well. It will take some effort, both on the technical side and in building social/emotional consensus about how much change is good and how bold versus conservative to be: putting the dialog into dialectic.

dialectic, boogie woogie woogie



#guile on freenode



Happy hacking!

Hey that was the talk! Hope you enjoyed the writeup. Again, video and slides available on the FOSDEM web site. Happy hacking!

by Andy Wingo

Il Pianista

Sway and the Dock station

I just moved permanently from awesome to Sway because I can barely see any difference. Really.

The whole Wayland ecosystem has improved a LOT since the last time I used it. That was last year; I've been giving Wayland a try once a year since 2016.

However, I had to ditch a useful daemon, dockd. It automatically disables my laptop screen when I put the laptop in the docking station, but it relies on xrandr.

What to use then?

ACPI events.

The acpid daemon can be configured to listen for ACPI events and trigger a custom script. You just have to define which events you are interested in (wildcards are accepted too) and which script acpid should run when such events occur.

I used acpi_listen to catch the events that get triggered by the physical dock/undock actions:

# acpi_listen
ibm/hotkey LEN0068:00 00000080 00004010
ibm/hotkey LEN0068:00 00000080 00004011
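When acpid later hands this event string to a handler script via %e, the shell splits it into words, so the action code (the last field above) lands in $4. A minimal sketch of that splitting:

```shell
# Emulate acpid's %e expansion: the event string becomes positional
# parameters, so the action code is available as $4 inside the handler.
set -- ibm/hotkey LEN0068:00 00000080 00004010
echo "$4"   # prints 00004010
```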

Then, I set up an acpid listener by creating the file /etc/acpi/events/dock with the following content (the event= line is the pattern acpid matches incoming events against):

event=ibm/hotkey.*
action=/etc/acpi/actions/dock.sh %e

This listener calls my script only when an event of type ibm/hotkey occurs, and the script then tells sway to disable or enable the laptop screen based on the action code. Here's my dock.sh script:


#!/bin/sh
# acpid passes the event string via %e; the shell splits it, so $4 holds
# the action code (per the acpi_listen output: 00004010 = dock, 00004011 = undock).

pid=$(pgrep '^sway$')

if [ -z "$pid" ]; then
    logger "sway isn't running. Nothing to do"
    exit 0
fi

user=$(ps -o uname= -p "$pid")

case "$4" in
    00004010)
        runuser -l "$user" -c 'SWAYSOCK=/run/user/$(id -u)/sway-ipc.$(id -u).$(pidof sway).sock swaymsg "output LVDS-1 disable"'
        logger "Disabled LVDS-1"
        ;;
    00004011)
        runuser -l "$user" -c 'SWAYSOCK=/run/user/$(id -u)/sway-ipc.$(id -u).$(pidof sway).sock swaymsg "output LVDS-1 enable"'
        logger "Enabled LVDS-1"
        ;;
esac

Don’t forget to make it executable!

chmod +x /etc/acpi/actions/dock.sh

And then start the acpid daemon:

systemctl enable --now acpid

Happy docking!

February 06, 2020

Riccardo Orioles

Un paese (normale) ci vuole

North and South. In the north, the decade ended in Sondrio, a pleasant Alpine town, where the doctors at the local hospital were unable to save a five-month-old baby girl admitted in an emergency. Her mother starts screaming and crying in despair. And the other patients mock her: "Make her shut up!", "Enough with these tribal rites!", "Silence, monkey!". The poor woman is Nigerian, so insulting her is normal.

In the south, it ended in Melito Porto Salvo, in the province of Reggio Calabria; or rather in an undisclosed location where a teenage rape victim secretly lives under police protection, to escape the threats and taunts of her fellow townspeople. "They all sided with the rapists," her father whispers. "They walk the streets free, and we are here, in hiding." The leader of the rapists is the son of the town boss, and in the normality of Melito, that explains everything.

Hey, isn't it February? What do these stories from last year have to do with anything? Why don't you think about the singers, the festival and the declarations, the things of today? The fact is that in the Happy Country the things of today last very little. Within a few days they are digested, become habit and quietly remain part of it, with no one thinking about them anymore. They are "normal".

The most important thing of all, in dark ages (and this is evidently a dark age) is to patiently mark the boundaries of "normality". In civilized countries, for example, the inhabitants are called citizens, they have rights, they are addressed with the polite "lei", they even vote every few years. Here, one in ten of us cannot: only those with light skin vote, and yet this election, clearly uncivilized, counts all the same.

The election fascinates this country enormously, almost as much as Sanremo or the football championship. The last time, oddly enough, it even got me interested; which is strange, because I know perfectly well that the real rulers (the Cianci, the Benettoni, the Messina Denari, and company) are not elected by anyone. But in this case it was about Bologna, the civil capital, on which the fascist columns were ferociously converging, sneering and shouting. I, now I can say it, would have gone there to vote, and I would have voted for anyone, good or bad, who stood against the invader. The idea also occurred to many young people up there, who gathered, chose themselves a (funny) name to stand out, and drove the barbarians from their land. Beautiful. The next step was to expand outward, to find other undertakings and useful things to do, and to call on others to lend a hand.

And it is always like this: it starts in Turin and you get '68, it starts in Palermo and there's the Pantera, it starts in... Enough; these things always begin in small places, briefly, and spread quickly to everything and everyone; the Change-Everything fairy flutters cheerfully above them, and what was normal suddenly appears stupid and unhappy, and what was strange appears possible and normal.

Unfortunately, besides the cheerful, laughing little fairies, a black dot soon appears in the sky: it approaches, descends, circles ever more tightly above the crowd of young people who, intent on their joy, don't even notice it.

It is the Ugly Wicked Witch, whose name is Politikè Politikànt. "You're a VIP! A leader!" she whispers to the girls and boys. "You are an important person! Talk with the other VIPs and leaders! What are you waiting for?". And alas, the naive (we hope they are naive) boys and girls do not chase the sorceress away, but for an instant they listen to her.

This "normal" life is not normal at all. Catania is no longer Catania, nor are Rome and Milan the Milan and Rome we knew: they are different cities from the ones we lived in before. They are East Berlin, they are Johannesburg or Nuremberg, they are corners of Weimar about to become the Reich. Not necessarily so: it is still possible to step out of this horrible film and return to our world; but we must wake up, rub our eyes, shake off the fake "normality", become real people again. "Normality", in a place like Catania's San Cristoforo or Verona or Scampia, means rounding up the young and throwing them into the jaws of self-destruction and horror. Whoever fails to denounce this "normality", every day and every time and loudly, betrays those young people and abets their destruction.

The article Un paese (normale) ci vuole originally appeared on Il Fatto Quotidiano.

by Riccardo Orioles

Data Knightmare (Italian podcast)

DK 4x21 - Il fenomeno del riconoscimento facciale

Until now, the facial recognition industry had at least tried to keep up an appearance of decency. Then the phenomenon arrived.

by Walter Vannini

February 02, 2020

Museo dell Informatica funzionante

Our historical VAX/VMS systems are back online 24/7!

After some technical difficulties, our VAX/VMS systems are finally back online, under SimH emulation. Thanks to the combined efforts of Parazyd, Katolaz, Gerry, Mancausoft, Roadstar and Zappolo, and to the hardware donations by Francesco, Simone, Alessio and Domenico!

We decided to use an emulator to spare the real hardware from wear and failures:

GLORGHNORTH, a VAXStation 4000/96, is unique at our Museum, and it’s too risky to keep it up and running for free access.

Background: VAXStation 4000/96. Foreground: MicroVAX 3100/40 RAM testing

SNORRVIJOIER is a MicroVAX 3100/40 – we have plenty of them, but one is already dead (motherboard problems), and another one started burning RAM modules. So we opted for SimH emulation.

MicroVAX 3100/40 motherboard testing

Now the two computers are back online in the new simulation form, after a disk image dump and a lot of configuration. As usual, they are connected to our RETRO-DECNET amateur network.

To connect use Telnet protocol:

SNORRVIJOIER: telnet medialab.freaknet.org 17023

GLORGHNORTH: telnet medialab.freaknet.org 17123

The free account is “LUTHER”. (Why LUTHER?)

We also fixed some bugs in the installed games, we're installing new ones, and we have installed a marvelous GOPHER browser!

Note: they’re online since 2005! 🙂

Have fun with VAX/VMS!

The article Our historical VAX/VMS systems are back online 24/7! first appeared on Museo dell'Informatica Funzionante.

by admin

February 01, 2020

n-gate.com. we can't both be right.

webshit weekly

An annotated digest of the top "Hacker" "News" posts for the last week of January, 2020.

Procrastination is about managing emotions, not time
January 22, 2020 (comments)
Some academics, via the BBC, correct a misconception nobody had. Hackernews tries to decide if being a lazy piece of shit is heritable or a product of upbringing. The rest of the comments are personal anecdotes about how Hackernews has always been healthy, but really began to excel once they convinced doctors to give them amphetamines.

What happened to Mint?
January 23, 2020 (comments)
The answer to the headline question is "it got bought by a regulatory-capture-enhanced monopoly and ignored." Hackernews has about sixteen thousand stealth-mode startups just about to swoop in and pick up the slack. Each Hackernews who announces such intent is immediately beset by unsolicited advice from armchair bankers. The rest of the comments are recommendations of financial services that almost, but do not, replace Mint.

List of Twitter mute words for your timeline
January 24, 2020 (comments)
An Internet figures out that Twitter's muting system can sometimes block their own advertisement platform. Hackernews complains about a lack of documentation for a list of twenty words to paste into a text box. A Twitter reports that there is nothing users can do to unfuck their timelines. The idea is nice, so the link is upvoted, but there isn't anything to say except "Twitter sucks at the only thing anyone wants them to do" so there isn't a lot happening in the comment section.

Am I Unique?
January 25, 2020 (comments)
Yes, no matter what you do. Hackernews struggles with the age-old conflict of the browser: "why should this program have so much power over my data" versus "I want to do everything I do with computers inside this program." Later, the second-oldest conflict is discussed: "I don't want to be identifiable on the internet" versus "this surveillance company will make my life hell unless I am identifiable."

Access to Wikipedia restored in Turkey after more than two and a half years
January 26, 2020 (comments)
Wikipedia is excited that they are once again available in the Soviet Union. The Erdogan apologists and the free-speech advocates organize a dance-off in the comment threads.

An Update on Bradfitz: Leaving Google
January 27, 2020 (comments)
A Google resigns to reinvent host-based authentication from first principles. Hackernews recognizes the Google's username, and commences to eulogize. No technology is discussed, but open-office plans are still unpopular.

WireGuard is now in Linus' tree
January 28, 2020 (comments)
The Linux kernel assimilates another hundred and fifty thousand lines of code. Hackernews is excited, because this greatly increases the chances that they'll get to use the software the next time they boot Ubuntu in a virtual machine on their Macbooks. The rest of the comments are people translating their VPN configurations into confused narratives.

Congrats! Web scraping is legal! (US precedent)
January 29, 2020 (comments)
The United States Government declares that website owners must use technology to combat scrapers instead of using the United States Government. Hackernews knows this is getting escalated to a higher court, and so feels entitled to expound poorly-constructed legal opinions based on their understanding of contract law.

Not everyone has an internal monologue
January 30, 2020 (comments)
An Internet discovers a method to determine which people are people and which are artifacts of the simulation. Hackernews quickly determines, based on various experiential reports from around the internet, that there is a hierarchy to how people's minds work, and that Hackernews is definitely at the top of that hierarchy, and that amphetamines are the key to staying there. Another set of Hackernews explain to us that some of these reports are incorrect, the reporters are simply misunderstanding their own thoughts, and Hackernews knows better than you do how your mind works.

My Second Year as a Solo Developer
January 31, 2020 (comments)
A webshit constructs a series of tables explaining in excruciating detail the process of losing money by making webshit. Hackernews is absolutely well-qualified to provide advice regarding the loss of money through making webshit, and proceeds to do so in volume and at scale. The top comment comes from the Hackernews Beauty Pageant Bronze Medalist, who provides hundreds of words amounting to zero practical information, then refuses to respond to the blog author.

by http://n-gate.com/hackernews/2020/01/31/0/

January 30, 2020

n-gate.com. we can't both be right.

FOSDEM: more boring shit

Let's take a look at my annotated copy of the FOSDEM 2020 main talk schedule, shall we?


Welcome to FOSDEM 2020
Preëmptive apologia.

The Linux Kernel: We have to finish this thing one day ;)
Solving big problems in small steps for more than two decades

The speaker would like to ascribe credit to Linux for the invention of all of the groundbreaking features that Linux has poorly copied from better operating systems and then conclude that Andy Tanenbaum was right.
Speaker's name anagram: THEME: HE RUINS LOTS

FOSSH - 2000 to 2020 and beyond!
maddog continues to pontificate

We are not treated to a definition of "FOSSH," but we may presume that poorly-supported hardware has joined the ranks of poorly-supported software as a topic of this conference.
Speaker's name anagram: LOGJAM? OLD HAND

FOSDEM@20 - A Celebration
The cliché of constant change

The FOSDEM organizers were unable to recruit a third keynote speaker willing to talk about anything interesting, so instead they tagged someone with twenty years' worth of nothing-better-to-do to rely on a nostalgia filter to make past conferences seem interesting in comparison.
Speaker's name anagram: VETOED OWN SIGN

Closing FOSDEM 2020
Post-facto regret.

Community and Ethics

How FOSS could revolutionize municipal government
with recent real-world examples

Fifty entire minutes from a person who thinks that the copyright license matters at all when operating a city government. The people who write the software used by actual city governments will not be in attendance, because neither Excel nor Powerpoint are available under a free-software license.
Speaker's name anagram: PEON DOES CARE

The Selfish Contributor Explained
The speaker attempts to resolve the cognitive dissonance inspired by the repeated claims of "protecting user freedoms" despite all open-source programmers being employed by massive software companies. No solutions are provided, but some advice is promised regarding the maintenance of license-cult affiliation while conning some webshit company into paying you to mismanage your hobby projects.
Speaker's name anagram: MOTLEY TAME JOBS

The Ethics Behind Your IoT
The speaker is convinced that some of the attendees might not have noticed the six hundred thousand articles from Buzzfeed, Gizmodo, Engadget, and Reader's Digest that personal surveillance furniture is not made with the owner's best interests in mind. This talk consists of @internetofshit's twitter feed on a five-year time delay.
Speaker's name anagram: CAN LOB MY DELL

Freedom and AI: Can Free Software include ethical AI systems?
Exploring the intersection of Free software and AI

Some ostensible programmers have decided they are the proper people to introduce ethics to the latest buzzword factory, based on close consultation with the largest privacy-violation apparatus ever constructed and a hodgepodge of B-team politicians-in-exile.
Speakers' name anagrams: I JUSTLY FROWN and HELL NO, MANIAC

Containers and Security

How Containers and Kubernetes re-defined the GNU/Linux Operating System
A Greybeard's Worst Nightmare

A Red Hat arrives to try to convince other middle managers that this container stuff is totally different and somehow relevant to that 5G thing you keep seeing in TV commercials. It is not different, and it is not relevant to any specific radio communications technology, but Red Hat's not paying for the talk to make sense. Red Hat is paying for the talk to convince ISP operators to buy Openshift licenses.
Speaker's name anagram: NERDLIKE AI

Fixing the Kubernetes clusterfuck
Understanding security from the kernel up

In this roller-coaster of a talk, the speaker will outline the crippling security failures of current IT-industry darling Kubernetes, then reassure us that these problems do have solutions, only to horrify us again by revealing that the solutions start with "shitting arbitrary code into a Turing-complete interpreter in Ring 0."
Speaker's name anagram: N.K. SAVIOR

Address Space Isolation in the Linux Kernel
Some programmers think they can work around hardware information leaks in code.
Speakers' name anagrams: LAMEST BYTE MOJO and TAPROOM PIKER

Guix: Unifying provisioning, deployment, and package management in the age of containers
The developer of a package manager nobody asked for, written in a programming language nobody likes, insists that the package manager is in fact a perfect replacement for whatever the hell you've been doing, regardless of what that is.
Speaker's name anagram: I COULD SCOUR TV


LumoSQL - Experiments with SQLite, LMDB and more
SQLite is justly famous, but also has well-known limitations

This talk, awkwardly shoved into the schedule at the last minute, sees the speaker claim that shitty fifty-cent on-device sensors are generating data faster than SQLite can store it. This is obviously horse shit, and the speaker hasn't actually got anything working yet, but the speaker needed some attention, which is the primary factor in whether one is allocated speaking time at FOSDEM.
Speaker's name anagram: SANE HARDER

SECCOMP your PostgreSQL
A PostgreSQL consultant tries to drum up public support for a patchset nobody wanted.
Speaker's name anagram: ENJOY A COW

dqlite: High-availability SQLite
An embeddable, distributed and fault tolerant SQL engine

An Ubuntu gets on the "SQLite, but better" bandwagon. Unlike the previous project, which combined SQLite with some code derived from academia to solve perceived flaws in the wrong layer, this project takes the revolutionary approach of combining SQLite with some code derived from academia to solve perceived flaws in the wrong layer.
Speaker's name anagram: A FAKE ARENA KEY

MySQL Goes to 8!
An Oracle shows up to convince us their also-ran database product is still relevant. The only remaining users of the program in question are people running Movable Type, and those people are using the non-Oracle fork of the program anyway. Still, the speaker has to say something to justify Oracle's payment for the trip. As a result, this talk will consist of the speaker reading the MySQL changelog directly from the commit log.
Speaker's name anagram: LØGS VARY? I'D HIKE

SWIM - Protocol to Build a Cluster
SWIM gossip protocol, its implementation, and improvements

While their government may have lost interest, some Russians are still interested in pretending to participate in Europe. The talk is about taking an academic paper from Cornell in 2002 and shoving it into the source code of a MongoDB clone.
Speaker's name anagram: VOILA! SLY VIPS HALVED.


Open Source Under Attack
How we, the OSI and others can defend it

A Google, a Facebook, and a Linux Foundation (a trade organization representing Google and Facebook) lecture us about how hard we must fight to keep their employers from destroying our access to computer software.
Speakers' name anagrams: SNAZZY? RICH? SICK. and SMALL SIX with HI, MEGA-CLENCH!

Is the Open door closing?
Past 15 years review and a glimpse into the future.

A bullshit peddler has a crystal ball, which would like us to know that all open-source programmers are wasting their time. Time is stressed in the talk description as being a precious resource, not to be squandered. It is greatly to be hoped that readers will heed this admonition and not waste any of their finite time listening to this jackass jabber for an hour.
Speaker's name anagram: LADY'S TOKEN FRIZZ

The core values of software freedom
A bureaucrat would like greasy computer nerds to stop getting angry on the internet about Codes of Conduct, but is too much of a coward to just say that.
Speaker's name anagram: RETHINK RACISM HATS

Why open infrastructure matters
OpenStack is a suite of computer management software which is most recently notable for having missed the boat and imploded at the same time. While the flaming wreckage slowly sinks into the ocean, one of the people responsible for the disaster would like us to know how important it is to support the class of software to which OpenStack belongs, even though anyone who is receptive to that message is already using Kubernetes.
Speaker's name anagram: RARER ICY HERTZ

Why the GPL is great for business
Debunking the current business licensing discussion

As terrified programmers desperately try to squeeze every drop of value from the code they relentlessly shit into GitHub, some conclude that the obvious path to profit is copyright law. A bureaucrat arrives to declare that copyright law can be profitable, but not that way, and out of the thousands and thousands of companies on Earth, successfully names three adherents to the copyright cult which have achieved success. One is no longer an independent company, another has had five owners in the past ten years, and the other was founded as a direct result of the mismanagement of its predecessor, but I'm sure none of that will come up in the talk.
Speaker's name anagram: RANKER FLAK SHTICK

United Nations Technology and Innovation Labs
Open Source isn't just eating the world, it's changing it

Another bureaucrat tries to convince us our tax money isn't being wasted. They're fucking around with blockchains (still!), so nobody will be convinced.
Speaker's name anagram: BACK ON DRAMA

Regaining control of your smartphone with postmarketOS and Maemo Leste
Status of Linux on the smartphone

Having failed to gain any traction in commercial telephony, Maemo is the natural place some programmers would turn when searching for a telephone interface to poorly copy. By combining bad clones of proprietary software, niche Linux distributions with no clear policies or mission, and vaporware hardware from a fly-by-night bad-computer vendor, you can have complete control of a telephone that doesn't exist, wouldn't boot if it did, and couldn't make phone calls if it booted. Rejoice, for your time is at hand.
Speakers' name anagrams: JEWEL, WARN MR. JIB! and ERR: BARB BITS


LibreOffice turns ten and what's next
Lots to learn, and get excited about

The fact that the talk description is beset with typesetting errors is so hilariously on-message for LibreOffice that I can make no further enhancements to the sense of derision this talk should inspire.
Speaker's name anagram: HECKLES A MIME

Over Twenty Years Of Automation
The speaker mysteriously refuses to specify what has been automated for twenty years, but cursory examination of the namedropped software product reveals the truth: yet another half-assed domain-specific language intended to abstract away those pesky computers. Only the speaker knows what systems administrators were doing prior to 1999 (hint: there has been an IEEE standard about it longer than that) and why it wasn't really automation -- much less why "reimplement everything from scratch in a fad language" became the right answer in 2015.
Speaker's name anagram: JAM IN BUSHES

Blender, Coming of Age
18 years of Blender open source projects

A programmer made some software people use and would like you to sit through an origin story.
Speaker's name anagram: NEONATAL ODORS

The Hidden Early History of Unix
The Forgotten history of early Unix

This is the annual "FOSDEM organizers accidentally scheduled an interesting talk" talk, delivered by someone who has put actual effort into communicating information that is not self-aggrandizing or otherwise devoted to corporate marketing.
Speaker's name anagram: HER OWLS RAN

Generation gaps
Another SUSE needed some business justification for the FOSDEM trip; this time it's a combination of "you're doing it wrong" and some mournful rehashing of failed also-rans from recent computing history.
Speaker's name anagram: PRIMEVAL NO


HTTP/3 for everyone
The next generation HTTP is coming

The author of curl, Daniel Stenberg, who wrote curl, is so excited to find a country he's allowed into that he takes a break from being the author of curl to tell us about a new version of HTTP, which addresses HTTP/2's "TCP over TCP" problem by migrating to a "TCP over TCP over UDP" model. By formalizing layer violations directly into the protocol, Google can remove any opportunity to interfere with advertising and surveillance, since the application (specified in the standards as Chrome) communicates directly to AdSense without leaking signs of important personal information exfiltration to hostile third parties, such as the user. Daniel Stenberg, author of curl, is the author of curl, which is written and maintained by Daniel Stenberg.
Speaker's name anagram: ABSENT IN LEDGER

State of the Onion
The Road to Mainstream Adoption and Improved Censorship Circumvention

Some people from the United States Navy's bestselling honeypot promise, as in previous years, new ways of attracting users who are not drug vendors, murder vendors, or furries. None of them will work.
Speaker's name anagram: I RAP UGLIER

Future internet that you can use today

Some academics arrive to tell us that (once again) they have Fixed the Internet, and (once again) it runs on top of the current actually-working internet, and (once again) if you sign up you can communicate with as many as twelve other computers.
Speakers' anagram names: OW! KAMIKAZE SLUTS and A ČASUÁL VIM KOOK


Improving protections against speculative execution side channel
The busiest human being on Earth, responsible for mitigating the weekly-discovered security failures in Intel hardware, gets paid to fly to Europe and show us how to work around his employer's increasingly-broken products by writing more computer programs.
Speaker's anagram name: ADD A TWIST REV

SaBRe: Load-time selective binary rewriting
Some academics have gone to a lot of effort to allow us to interfere with software whose source code we already have. Since there's no immediate use for this technology, and the speaker is somewhat bored, we're going to get a demo.
Speaker's anagram name: NEUTRAL PARANOIAS

The year of the virtual Linux desktop
Someone thinks anyone still cares about VR.
Speaker's name anagram: URBAN LOCK SIZES

Making & Breaking Matrix's E2E encryption
In which we exercise the threat model for Matrix's E2E encrypted decentralised communication

Yet Another IRC Clone reports that they've successfully solved some of the problems they inflicted upon themselves.
Speaker's name anagram: DAMN GHETTO SHOW

by http://n-gate.com/fosdem/2020/01/29/0/

Data Knightmare (Italian podcast)

DK 4x20 - Edtech Agit-prop

By kind permission: Audrey Watters' (twitter: @AudreyWatters) keynote at OEB 2019 in Berlin. Voiced by S. Ortolani. Original at http://hackeducation.com/2019/11/28/ed-tech-agitprop

by Walter Vannini

January 29, 2020

Classic Programmer Paintings

“RHEL sysadmins entering the Docker convention...

“RHEL sysadmins entering the Docker convention floor”

oil on canvas



Review of Snowden's book Permanent Record - Part II: At the NSA

(Updated: December 20, 2019)

More than 6 years after the first disclosure of Top Secret documents from the NSA, after numerous video appearances and more than 4000 tweets, Edward Snowden has now written an autobiography. It's titled Permanent Record and was published simultaneously in over 20 countries on September 17.

An extensive discussion of the first half of this book, from Snowden's youth to his jobs at the CIA, is provided in Part I of this review. Here, the focus is on his time at the NSA, which he accuses of collecting everyone's information and storing it forever. However, the book in no way substantiates these claims; it misrepresents the NSA's collection programs and fails to justify his massive theft of classified data.


Sysadmin at the NSA in Japan

In August 2009 Snowden moved to Japan for his first job at the NSA. This was yet another contractor job, as he was hired by Perot Systems (which was taken over by Dell in September 2009) under the NSA's Agency Extended Information Systems Services (AXISS) contract.

His new workplace was at the NSA's Pacific Technical Center (PTC) at Yokota Air Base, near Tokyo. This facility was opened in 2003 as "the sister organization to the highly successful European Technical Center (ETC), providing essential technical and logistical services to vital cryptologic missions in the Pacific Theater."

Here, Snowden worked as a systems administrator responsible for maintaining the local NSA systems and helping to connect the NSA's systems to those of the CIA. As such he found out that the NSA was far ahead in terms of cyberintelligence, but far behind when it came to cybersecurity:
"In Geneva, we'd had to haul the hard drives out of the computer every night and lock them up in a safe - and what's more, those drives were encrypted. The NSA, by contrast, hardly bothered to encrypt anything."
(p. 166)


In Japan, Snowden noticed that the NSA had no proper backup system: because of limited bandwidth, local collection sites often did not send copies back to NSA headquarters. He then engineered an automated backup and storage system that was initially named EPICSHELTER but was later renamed the Storage Modernization Plan/Program. (p. 166-168)

This system would constantly scan the files at every NSA facility; only if the agency lacked a copy back home would the data be automatically queued for transmission. It's not known how accurate this description is, because no original documents about EPICSHELTER have been published.

It's likely though that the scope of the system was smaller than the book suggests and only handled documents and reports produced by NSA employees, not the data the agency intercepted (in Oliver Stone's biographical thriller the fictional Snowden says that EPICSHELTER was only "collecting our finished intel").

For its intercepted communications, the NSA already had a system with a more or less similar function: XKEYSCORE, which in 2008 consisted of filtering systems at some 150 local collection sites. Analysts instruct these local filters to select data of interest, which are subsequently transferred to the agency's central databases. Data that are not of interest disappear from the system's rolling buffer after around 30 days.
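The rolling-buffer behavior described here can be pictured as a simple age-based purge: whatever the analysts' filters do not select for transfer simply expires. The following is purely an illustrative sketch with invented paths and filenames (it assumes GNU touch/find), not anything taken from the published documents:

```shell
# Illustrative only: a ~30-day rolling buffer where unselected data ages out.
buffer=$(mktemp -d)                              # stand-in for a site's local buffer
touch -d '40 days ago' "$buffer/stale.capture"   # already past the ~30-day window
touch "$buffer/fresh.capture"                    # still inside the window
find "$buffer" -type f -mtime +30 -delete        # expire anything older than ~30 days
ls "$buffer"                                     # prints: fresh.capture
```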

Slide from an NSA presentation about XKEYSCORE
showing its federated query hierarchy
(click to enlarge)

Leaving readers with the impression that EPICSHELTER copied and stored virtually all of the NSA's data, Snowden writes:
"The combination of deduplication and constant improvements in storage technology allowed the agency to store intelligence data for progressively longer periods of time. Just over the course of my career, the agency's goal went from being able to store intelligence for days, to weeks, to months, to five years or more after its collection. By the time of this book's publication, the agency might already be able to store it for decades." (p. 167)
Snowden then claims that it is the NSA's ultimate dream "to store all of the files it has ever collected or produced for perpetuity, and so create a perfect memory. The permanent record." (p. 168)

The Utah Data Center

Given that Permanent Record is the title of the book, one would expect a solid substantiation of this claim, but the only "corpus delicti" that Snowden comes up with is the huge $1.2 billion data center that the NSA built near Bluffdale, Utah, which was probably first reported in July 2009. (p. 246-247)

Snowden says that within the NSA this data center was initially called "Massive Data Repository" but then renamed to "Mission Data Repository" to sound less creepy. This isn't a unique designation for the Utah complex though, because from other sources we know that the NSA has multiple Mission Data Repository (MDR) cloud platforms.

We can assume that Snowden searched for internal NSA documents about the Utah Data Center (UDC), but either he found nothing, or nothing has been published. Maybe that's because it's simply a big backup facility for the US Intelligence Community as a whole?

That at least seems a plausible option given its official name of "Intelligence Community Comprehensive National Cybersecurity Initiative Data Center" with the purpose of providing a secure and resilient environment supporting the nation's cyber security.

The only relevant piece from the Snowden trove is a map showing that in Utah one can find the NSA's Utah Language Center and two of the NSA's GHOSTMACHINE (GM) cloud computing platforms, codenamed gmCAVE and gmPEACH. It's not clear though whether this is the situation before or after the opening of the data center.

Slide from a 2012 NSA presentation showing the locations
of the agency's GHOSTMACHINE cloud platforms
(click to enlarge)

Permanent Record?

Contrary to Snowden's claim about a "permanent record", many of the data the NSA collects are actually stored for much shorter periods of time. For the programs where communications from foreign targets are collected inside the United States the maximum retention periods for unevaluated data are:
- PRISM (targeted collection from internet companies): 5 years
- Upstream (targeted collection from backbone cables): 2 years
- Section 215 (bulk collection of domestic telephone metadata): 5 years

It seems there were no clear storage restrictions for data collected outside the US under EO 12333 authority, but examples show that they were not kept very long: the NSA's main database for internet metadata, MARINA, stored data for a year, while the massive data processing system RT-RG used in Iraq and Afghanistan could initially hold its data for no more than a month.

In response to the Snowden disclosures, president Obama issued Presidential Policy Directive 28 (PPD-28), in which he determined that personal information about foreigners shall also "not be retained for more than 5 years".

However, Obama's directive didn't change the policy that encrypted communications may be stored indefinitely, something that was useful in the past, when only things of importance were encrypted, but makes less sense nowadays. It's ironic that when Snowden urges us to encrypt our data, this actually means they could be stored much longer than if we don't.

On December 12, 2019, the NSA's Inspector General (IG) published a report about the retention requirements for SIGINT data. Many data have to be deleted after a number of years, but the report found several deficiencies in that process. The IG made 11 recommendations and the NSA agreed to implement all of them.


The limitations on storing data from PRISM, Upstream and Section 215 only became public through the declassification of opinions from the FISA Court as well as from a report from the NSA's Civil Liberties and Privacy Office, both in response to one-sided press reports about these programs.

This means that while he was working at the NSA, Snowden may not have been aware of these limitations and therefore jumped to the conclusion that the agency wanted to store its data for as long as possible. But by still not mentioning these limited retention periods in his book, Snowden deliberately misleads his readers.

Snowden's atomic moments

According to Permanent Record, Japan was Snowden's "atomic moment" where he realized that "if my generation didn't intervene the escalation would only continue" and surveillance would become "the ear that always hears, the eye that always sees, a memory that is sleepless and permanent." (p. 184-185)

There were however two moments that raised his suspicions:

1. China's domestic surveillance

The first moment was when the NSA's Pacific Technical Center hosted a conference on China and Snowden had to step in as a replacement by giving a briefing about the intersection between counterintelligence and cyberintelligence. (p. 169)

Preparing his briefing, he read about China's mass surveillance against its own citizens and then suspected that the US government was doing the same, because "if something can be done, it probably will be done, and possibly already has been". (p. 170-171)

But how could such surveillance remain secret in an open society like that of the United States, while even the censoring and monitoring measures from the tightly controlled Chinese society are well known? And what would such domestic surveillance have to do with the NSA, which is a military foreign intelligence agency?

Like more radical privacy activists Snowden seems to assume that intelligence agencies like the NSA and CIA desperately want to spy on their own citizens.* But if the government really wants to do so, there are other and easier options, for instance through the FBI and other law enforcement agencies that have the power to wiretap and access to government and private databases.

Another example of mixing these things up is when Snowden describes that he couldn't tell his girlfriend that his "former coworkers at the NSA could target her for surveillance and read the love poems she texted me." It's hard to believe that Snowden really thought that: if there had been a reason to monitor her, it would have been done by the FBI, not the NSA. (p. 197)

2. The STELLARWIND report

The second moment that apparently scared Snowden was when he read a very secret report about the President's Surveillance Program (PSP), which was established by president George W. Bush after the attacks of 9/11. It gave the NSA the power to track down foreign terrorists without a warrant from the Foreign Intelligence Surveillance Court (FISC) and was therefore also known as Warrantless Wiretapping.

An unclassified report about the PSP was published in July 2009, which gave Snowden the impression that graver things had been going on than just targeted interception of terrorists. This suspicion sent him searching for the classified report on the President's Surveillance Program, which he only found somewhat later by chance. (p. 174-175)
While being interviewed for The Joe Rogan Experience podcast on October 23, 2019, Snowden said that he found the classified version of the STELLARWIND report only somewhere in 2012. It turned up when he ran some "dirty word searches" to help out the Windows network systems administration team that sat next to him when he was in the Office of Information Sharing at NSA Hawaii (see below).

The report appeared to be in a separate classification compartment under the code name STELLARWIND (STLW), and only because someone from the office of the NSA's Inspector General who had come to Hawaii had left a draft copy on a lower-security system did it pop up as something that Snowden had to remove and delete. Instead, he read it all the way through. (p. 175)

The first page of the highly classified STELLARWIND report
(click for the full report)

After reading the highly restricted report, Snowden found that "the activities it outlined were so deeply criminal that no government would ever allow it to be released unredacted". (p. 176)

This claim requires an explanation of the STELLARWIND program, which doesn't follow in the book, despite the fact that the classified report is very detailed. It makes clear that the program encompassed 4 components:
- Targeted collection of telephony content
- Targeted collection of internet content
- Bulk collection of domestic telephony metadata
- Bulk collection of domestic internet metadata

This may look massive, but on page 9 of the report NSA director Michael Hayden is cited saying that "NSA would not collect domestic communications". Furthermore it explains that the program was only used to collect communications from:
- Members of al-Qaeda and its affiliates (since October 2001)
- Taliban members in Afghanistan (from October 2001 to January 2002)
- The Iraqi Intelligence Service (from March 2003 to March 2004)

The content of these targets' communications was collected by filtering backbone cable traffic using some 11,000 phone numbers and e-mail addresses.* On pages 38 and 39 the report says that the bulk collection of both telephone and internet metadata was also strictly limited to finding unknown conspirators of known members of al-Qaeda.

Between 2004 and 2007, all four components of the STELLARWIND program were moved from the president's authority to that of the FISA Court (FISC), based upon a creative interpretation of the Patriot Act and the new Protect America Act.

According to the original report, STELLARWIND was not used for large-scale monitoring of American citizens,* but that's not something we learn from Permanent Record, which is not only misleading but also fails to account for the reason why Snowden was apparently so upset after reading it.

Security clearance reinvestigation

In September 2010, Edward Snowden left Japan and returned to Maryland, where Dell provided him a new job as a technical solutions consultant for their CIA contract, a job that didn't require a security clearance, because the CIA refused to grant him access to classified information (see Part I of this review).

Around that time, Snowden was also due for a periodic background reinvestigation, but when the review was completed in May 2011, no derogatory information had been found. According to the HPSCI-report this was because the investigation was incomplete as, for example, it "never attempted to verify Snowden's CIA employment or speak to his CIA supervisors".

Not much later, Snowden was diagnosed with epilepsy after which he took a four-month disability leave from work until January 2012. According to his memoir, he decided "to start over" and take a less stressful job in Hawaii where the climate and more relaxed lifestyle was better to prevent epileptic seizures. (p. 215)

Did Snowden, who clearly didn't fit into a government bureaucracy, ever consider a private sector job in Silicon Valley, where there's an equally nice climate? Or was he determined to find out more about mass surveillance and therefore stayed inside the Intelligence Community, although not yet ready to sacrifice everything for that goal? (p. 215)

Sysadmin at the NSA in Hawaii

By the end of March 2012, Snowden and his girlfriend had moved to Hawaii, where he got a new job for Dell at the NSA's regional Cryptologic Center.

While most NSA employees had moved to a new building at the beginning of 2012, Snowden and other technical support workers remained in the so-called Kunia Tunnel, a three-story underground bunker facility originally built for aircraft assembly during World War II.

Here, he worked for exactly one year, until March 2013, as a SharePoint systems administrator and the sole employee of the Office of Information Sharing. It was "a significant step down the career ladder, with duties I could at this point perform in my sleep." (p. 214)

The tunnel entrance to the former Kunia Regional Security Operations Center
in Hawaii, where Snowden worked from March 2012 to March 2013
(photo: NSA - click to enlarge)


Just like in his first job at CIA headquarters, Snowden started by automating his tasks, writing scripts to do the work for him "so as to free up my time for something more interesting." (p. 214)

That more interesting activity is described in what is probably the most important and most surprising revelation of Permanent Record:
"I want to emphasize this: my active searching out of NSA abuses began not with the copying of documents, but with the reading of them. My initial intention was just to confirm the suspicions that I'd first had back in 2009 in Tokyo. Three years later I was determined to find out if an American system of mass surveillance existed and, if it did, how it functioned." (p. 215)

Here, Snowden basically admits that he isn't a whistleblower: he wasn't confronted with illegal activities or significant abuses and subsequently collected evidence of that, but acted the other way around, gathering as much information as he could get, based only upon a vague and, as we have seen, rather far-fetched suspicion.

Snowden also doesn't share whether he found any concrete misconduct in those numerous files, things that could have triggered his decision to hand them over to journalists. He even omits almost all the disclosures made by the press, which means that Permanent Record contains hardly anything that justifies his unprecedented data theft.

E-mail from Snowden as systems administrator in Hawaii, August 2012
Declassified by the NSA in June 2016
(Click to enlarge)

Readboards and Heartbeat

While his colleagues at the Kunia Tunnel watched Fox News, Snowden's quest for information started with reading what he calls "readboards", a kind of digital bulletin boards where each NSA site posted news and updates. (p. 220)

He started hoarding documents from all these readboards, creating an archive of everything he thought was interesting. After a complaint about exceeding his storage quota, Snowden came up with the idea of sharing his personal collection with his colleagues, as a justification, or "the perfect cover", for collecting material from more and more sources. (p. 221, 256)

He then got approval from his boss to create an automated readboard that would perpetually scan for new and unique documents, not only from NSAnet, but also from the networks of the CIA, the FBI as well as from JWICS, the high-level Defense Department intelligence network. (p. 221)

Instead of only gathering titles and metadata like common RSS readers do, the system had to pull in full documents so NSA Hawaii would have access to all the necessary information in case the fiber-optic cable that connected it with NSA headquarters was severed as a result of a power outage or a cyber attack.

Snowden called the new system Heartbeat (not in capitals in the book) because "it took the pulse" of the NSA and of the wider Intelligence Community (IC), but the program was also important for another reason: "Nearly all of the documents that I later disclosed to journalists came to me through Heartbeat." (p. 221-222)

Mock-up of the Heartbeat interface in Oliver Stone's biographical thriller Snowden
(screenshot from Snowden - click to enlarge)

Scraping tools and stolen passwords

The HPSCI-report says Snowden started his mass downloading of NSA data somewhere around August 1, 2012, using two common scraping tools, called DownThemAll! and wget. These tools were available for legitimate system administrator purposes, but Snowden used them to scrape "all information from internal NSA networks and classified webpages of other IC elements."

This is followed by two redacted sections, so it's not known whether the report acknowledges that this scraping effort was part of an authorized program named Heartbeat. Snowden doesn't mention the scraping tools in his book, but in a video appearance on August 20, 2019, he admitted that he "wrote some scrapers".
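Scraping tools like wget and DownThemAll! work by extracting the hyperlinks from each fetched page and then following them recursively. The link-extraction core of such a tool can be sketched with Python's standard library (a toy illustration of the general technique, not Snowden's actual tooling):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect absolute URLs from <a href=...> tags, as a scraper would
    before queueing each linked page for download."""

    def __init__(self, base_url: str):
        super().__init__()
        self.base = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's own URL
                    self.links.append(urljoin(self.base, value))

def extract_links(base_url: str, html: str) -> list:
    """Return every link found in an HTML page, resolved to absolute URLs."""
    parser = LinkExtractor(base_url)
    parser.feed(html)
    return parser.links
```

A full scraper would fetch each extracted URL in turn, extract its links, and repeat until no unvisited pages remain.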

Besides the bulk downloading, the HPSCI-report says that Snowden used "his systems administrator privileges to search across other NSA employees' personal network drives and copy what he found on their drives". He also searched for "files related to the promotion and hiring decisions" on the personal network drives of people who had been involved in decisions about jobs for which Snowden had applied.

As early as November 2013, Reuters reported that Snowden had persuaded perhaps as many as 25 fellow workers to give him their logins and passwords, but in a live chat in January 2014, Snowden vehemently denied this: "I never stole any passwords, nor did I trick an army of co-workers".

The HPSCI-report from 2016 confirmed Reuters' reporting and says that Snowden asked "several of his co-workers for their security credentials so he could obtain information that they could access, but he could not. One of these co-workers subsequently lost his security clearance and resigned from NSA employment."

One would expect that Permanent Record addresses these specific and quite serious accusations, but they are completely ignored. In more general terms however, the book confirms Snowden's almost insatiable desire for information regardless of whether he was entitled to it - he almost seems proud of how easily he could circumvent auditing controls and internal monitoring systems like MIDNIGHTRIDER. (p. 256)

"Collect it All"

While almost "every journalist who later reported on the disclosures was primarily concerned with the targets of surveillance", like American citizens or foreign leaders, Snowden's own curiosity was of a technical nature: "the better you can understand a program's mechanics, the better you can understand its potential for abuse." (p. 222)

While Glenn Greenwald saw the slide below as evidence that NSA really wants to "Collect it All", Snowden now says that this was "just PR speak, marketing jargon" intended to impress America's Five Eyes partners and therefore gave him "no insight into how exactly that ambition was realized in technological terms." (p. 222-224)

Slide from a presentation about satellite collection capabilities
at Menwith Hill Station in the United Kingdom, 2011

Given how keen Snowden was to find out the inner workings of the NSA's collection systems, surprisingly little detail about them is found in his book. For example, the best-known and most controversial programs, Section 215 and PRISM, are addressed in only one paragraph each. (p. 222-223)

Just as little information is provided about other NSA collection programs - apparently because such details would undermine Snowden's repetitive claim that the NSA tries to collect everyone's data to store them forever. For example:

- Bulk collection of domestic telephone metadata under Section 215 was limited to counter-terrorism investigations and only used for contact-chaining with no more than 288 seed numbers in 2012, resulting in 6000 numbers that analysts actually looked at.

- Targeted collection from internet companies under PRISM doesn't allow "direct access" to the servers of the companies, has multiple layers of oversight and was used against roughly 160,000 specific foreign targets in 2018.
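Contact-chaining under Section 215 amounts to a breadth-first expansion of a call graph, starting from a small set of approved seed numbers and following call records outward for a limited number of hops. A minimal sketch of that technique (the records, identifiers and hop limit are illustrative assumptions, not from any NSA documentation):

```python
from collections import deque

def contact_chain(call_records, seeds, max_hops=2):
    """Breadth-first expansion of a call graph from seed identifiers.

    call_records: iterable of (caller, callee) phone number pairs.
    Returns every number reachable from a seed within max_hops calls.
    """
    graph = {}
    for a, b in call_records:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)  # treat contact as symmetric

    seen = set(seeds)
    frontier = deque((s, 0) for s in seeds)
    while frontier:
        number, hops = frontier.popleft()
        if hops == max_hops:
            continue  # don't expand beyond the hop limit
        for contact in graph.get(number, ()):
            if contact not in seen:
                seen.add(contact)
                frontier.append((contact, hops + 1))
    return seen
```

Even with few seeds, each extra hop can multiply the result set, which is why a couple of hundred seed numbers could chain out to thousands of numbers for analysts to look at.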


The most detailed, but still rather limited description in Permanent Record is that of the technologies behind Upstream collection, which is the interception of foreign communications at backbone cables and switching facilities. Snowden says that if you want to look something up on the internet, it has to pass "through TURBULENCE, one of the NSA's most powerful weapons." (p. 225)

According to an internal NSA dictionary, TURBULENCE isn't so much a weapon, but a "framework of mission modernization". A detailed explanation of this framework on the weblog of Robert Sesek shows that it has nine different components, including TURMOIL and TURBINE, which also feature in Snowden's book:

TURMOIL is installed at many locations around the world and makes a copy of a data stream based upon selectors like e-mail addresses, credit card or phone numbers, etc. Suspicious traffic is then tipped over to TURBINE, which uses algorithms to decide whether computer exploits should be used against certain kinds of web traffic. Then, TURBINE injects the exploits in the web traffic back to the target's computer: "Your entire digital life now belongs to them". (p. 225-226)
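The selector-based filtering attributed to TURMOIL here, copying only the traffic that matches tasked identifiers and letting everything else pass, can be illustrated in a few lines of Python (the message format and selectors are invented for the example):

```python
def filter_traffic(stream, selectors):
    """Keep only items whose sender or recipient matches a tasked
    selector; in TURMOIL's case these would be tipped to further
    processing, while non-matching traffic simply passes through."""
    tipped = []
    for item in stream:
        if item["from"] in selectors or item["to"] in selectors:
            tipped.append(item)
    return tipped
```

The point of the sketch is that filtering on specific selectors is inherently targeted: traffic to or from anyone not on the selector list is never copied out of the stream.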

Snowden claims that these systems "are the most invasive elements of NSA's mass surveillance system, if only because they're the closest to the user." But as TURMOIL filters communications traffic for data that match specific selectors, this qualifies as targeted collection, which is generally preferred over indiscriminate bulk collection.

It's only because Snowden has the habit of describing all the NSA's collection efforts as if they are directed against everyone and anyone ("your traffic", "your digital life") that even targeted collection sounds very scary, but as long as you're not a target, these exploits won't find their way to your computer.

A slide from an unpublished NSA presentation about the TUMULT component of
the TURBULENCE program as seen in the documentary film Citizenfour
(screenshot by paulmd - click to enlarge)

Exfiltrating the data

In his memoir, Snowden says that the big decisions in (his) life are made subconsciously and only express themselves once fully formed: "once you're finally strong enough to admit to yourself that this is what your conscience has already chosen for you." (p. 214)

Snowden's preparations for leaking to the press apparently started in August 2012, which is earlier than previously assumed. But before handing over his personal collection of Top Secret files, he wanted to "search them and discard the irrelevant and uninteresting, along with those containing legitimate secrets". (p. 256-257)

This was quite difficult on monitored NSA computers, so he took an old Dell PC that he found in a forgotten corner: "Under the guise of compatibility testing, I could transfer the files to these old computers, where I could search, filter, and organize them as much as I wanted, as long as I was careful." (p. 256-257)

It seems that Snowden used this desktop computer as a "thin-on-thick" device, which means that it officially served as a thin client. According to the HPSCI-report Snowden requested such a thin-on-thick computer in late August 2012, which is less than a month after he started bulk downloading internal NSA files.

Careful evaluation?

This set-up allowed Snowden to get "the files I wanted all neatly organized into folders" and later on, he assured that he "carefully evaluated every single document I disclosed to ensure that each was legitimately in the public interest". (p. 258)

Given the huge number of files that he handed over (the book says nothing about their exact number), it's hard to imagine that Snowden was able to evaluate them as carefully as he said. In his memoir he already admits how complicated this was:
"Sometimes I'd find a program with a recognizable name, but without an explanation of what it did. Other times I'd just find a nameless explanation, with no indication as to whether the capability it described was an active program or an aspirational desire. I was running up against compartments within compartments, caveats within caveats, suites within suites, programs within programs" (p. 217)

Apparently it was as difficult for Snowden as it was for the journalists to make sense of these never-before-seen documents, but with the difference that Snowden had less than a year to study them part-time, while a dozen journalists and their assistants have worked on them for over five years and may still not have solved all the puzzles.

Even in his hotel room in Hong Kong, in the week before he would meet Greenwald and Poitras, Snowden was sorting his archive, and in order to make it as comprehensible as possible for nontechnical people he also put together dictionaries and glossaries of abbreviations like CCE, CSS, DNI and NOFORN. (p. 288-289)

All these efforts didn't prevent mistakes in the early press reports, like for example that the NSA had "direct access" to the servers of Facebook, Google, and other internet companies. The misinterpretation of the BOUNDLESSINFORMANT slides was another major case that made clear that both Snowden and the journalists lacked enough information about this tool.

When in April 2015, John Oliver expressly asked whether he really had read every single document, Snowden eventually backed down from his original statement saying "Well, I do understand what I turned over" and slowly conceded that his actions carried dangers regardless of his own intentions or competence.

The Rubik's Cube

The next step in exfiltrating the files was getting them out of the Kunia Tunnel complex. Taking pictures with a smartphone wasn't an option, so Snowden decided to copy them onto mini- and micro-SD cards. These have so little metal in them that they will hardly trigger metal detectors, but they are extremely slow to write to: it can take up to 8 hours to fill a single card. (p. 258-259)
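The eight-hour figure is plausible given the write speeds of older SD cards. Assuming, for example, a 32 GB card and a sustained write rate of about 1.2 MB/s (both values are my own illustrative assumptions, not from the book), the arithmetic works out roughly as follows:

```python
card_bytes = 32 * 10**9        # assumed card capacity: 32 GB
write_rate = 1.2 * 10**6       # assumed sustained write speed: 1.2 MB/s

hours = card_bytes / write_rate / 3600
print(f"time to fill one card: {hours:.1f} hours")  # about 7.4 hours
```

A slower card or a larger capacity easily pushes the figure to the 8 hours Snowden mentions.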

This had to be repeated multiple times and so Snowden sneaked the SD cards past the security checks in different ways: in his sock, in his cheek (so he could swallow it if needed) and at the bottom of his pocket. He doesn't confirm or deny whether he also used a Rubik's Cube to hide an SD card, or that the cube was just used to distract the guards. (p. 259)

Oliver Stone's film Snowden showing how an SD card was hidden in a Rubik's Cube
(screenshot from Snowden - click to enlarge)

At home, Snowden transferred the files from the SD cards to a larger storage device and secured them with multiple layers and different methods of encryption. Altogether, the documents fitted on a single drive, which he left out in the open on his desk at his home, confident that they were protected by the encryption. (p. 262-263)

Handing over the files

On December 1, 2012, Snowden first contacted columnist Glenn Greenwald, but when it proved difficult for him to set up an encrypted communications channel, Snowden contacted filmmaker Laura Poitras on January 13, 2013, after he had received her public key through Micah Lee of the Electronic Frontier Foundation. (p. 250-253)

It's not clear when Snowden sent Poitras the first set of documents that she showed to Greenwald on their flight to Hong Kong.* Eventually, they each received a copy of the full archive when they met Snowden on June 2/3 at his room in the Mira Hotel.

An intriguing story that's not in Permanent Record, but was told in Harper's Magazine in May 2017, is that already on May 10, 2013, Snowden had sent (encrypted) backup copies of the NSA files in postal packages to Jessica Bruder in New York, to Trevor Timm of the Freedom of the Press Foundation, to one person who wants to remain anonymous, and to one unknown person.

In his book, Snowden tries to explain how thoroughly he secured his own archive of NSA documents (through some kind of key distribution scheme), but how about the keys for what was in these packages? And what has happened to the packages?


Infrastructure analyst at the NSA in Hawaii

On March 30, 2013, Edward Snowden started a new job as an infrastructure analyst for intelligence contractor Booz Allen Hamilton (BAH) at the NSA/CSS Threat Operations Center (NTOC) of NSA Hawaii.

NTOC is a watch center that provides real-time network monitoring and cyber defense capabilities and is located in the NSA's new Joseph J. Rochefort Building (nicknamed "Roach Fort" or "The Roach"), which was officially opened in January 2012.

The Joseph J. Rochefort Building of NSA/CSS Hawaii near Wahiawa in Honolulu
where Snowden worked from mid-April to mid-May 2013.
(still from CBS News - click to enlarge)

There are different versions of the reason why Snowden took this new job. In his memoir he says that after reading about all those NSA programs, systems and tools, his final desire was to see how they were operated by the analysts who take the actual targeting decisions: "Was there anyone this machine could not surveil?" (p. 275-276)

He was especially interested in the XKEYSCORE system, which would later be presented as the NSA's "widest-ranging tool, used to search nearly everything a user does on the Internet". The Booz Allen job as an infrastructure analyst allowed him to work with XKEYSCORE to monitor suspicious activities of hostile cyber actors on the infrastructure of the internet. (p. 277)

Dual-hat authority

Another and more specific reason was given in an interview with the South China Morning Post (SCMP) on June 24, 2013, in which Snowden said that he took the new job because: "My position with Booz Allen Hamilton granted me access to lists of machines all over the world the NSA hacked".

Later, Snowden explained that in his opinion "we’ve crossed lines. We're hacking [Chinese] universities and hospitals and wholly civilian infrastructure rather than actual government targets and military targets." It was to get access to this kind of information that he took the new job, which "gave him rare dual-hat authority covering both domestic and foreign intercept capabilities".

That "dual-hat" also allowed Snowden to find out whether "vast amounts of US communications were being intercepted and stored without a warrant, without any requirement for criminal suspicion, probable cause, or individual designation."

In his new job he continued copying internal NSA documents (maybe he could still use his previous sysadmin privileges?), but to actually exfiltrate them, he had to return after hours to his old desk with the thin-on-thick computer at the Kunia Tunnel - according to the HPSCI-report.

By-catch conversations

According to Greenwald's book No Place to Hide, Snowden had an even bigger goal in mind when he applied for the job as an infrastructure analyst: the raw surveillance repositories of the NSA. "He took a pay cut to get that job, as it gave him access to download the final set of files he felt he needed to complete the picture of NSA spying."

He succeeded and handed the files over to Barton Gellman of The Washington Post, which in July 2014 reported on these ca. 22,000 collection reports from 2009 to 2012, which contained roughly 160,000 intercepted e-mails and instant messages. Analysis showed that they came from more than 11,000 accounts, while 9 out of 10 account holders were not the intended targets and nearly half of them were Americans.

These online conversations were intercepted through PRISM and Upstream, which is targeted collection, but in Snowden's view it clearly crossed the line of proportionality. In The Post he said that such a "continued storage of data of innocent bystanders in government databases is both troubling and dangerous. Who knows how that information will be used in the future?"

The future danger is largely mitigated by the limited retention period of up to 5 years, but the fact that even this targeted collection leads to such a large amount of by-catch is one of the most problematic aspects of the NSA's operations. Therefore it's puzzling that Snowden doesn't mention this issue at all in his book, especially because The Washington Post's report is not widely known.

Witnessing abuses?

Before starting his new job, Snowden first had to attend a two-week training course at NSA headquarters. There, and during "the short stint I put in at Booz back in Hawaii, were the only times I saw, firsthand, the abuses actually being committed that I'd previously read about in internal documentation." (p. 279)

Here, one expects an explanation of these abuses, but as we will see, Snowden only presents some minor cases in which the NSA's collection system was misused by individual analysts, which doesn't even come close to an organization "in which malfeasance has become so structural as to be a matter not of any particular initiative, but of an ideology" as Snowden puts it. (p. 235)


It's allegedly XKEYSCORE that enables these abuses, but it remains unclear whether Snowden actually has a good understanding of how this system works. At least his descriptions in the book are incomplete and misleading.

He says that by studying the technical specs he found out that XKEYSCORE works "by 'packetizing' and 'sessionizing,' or cutting up the data of a users' online sessions into manageable packets for analysis" - actually, 'sessionizing' means that the small IP packets in which internet communications travel are reassembled into their original format for further analysis. (p. 278-279)
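Sessionizing in this sense, reassembling individual IP packets into complete sessions, can be sketched as grouping packets by flow and concatenating their payloads in sequence order. This is a deliberately simplified model (real TCP reassembly also handles retransmissions, overlaps and bidirectional streams):

```python
from collections import defaultdict

def sessionize(packets):
    """Group packets by flow (src, dst) and reassemble their payloads
    in sequence order, yielding one byte string per session."""
    flows = defaultdict(list)
    for pkt in packets:
        flows[(pkt["src"], pkt["dst"])].append(pkt)

    sessions = {}
    for flow, pkts in flows.items():
        pkts.sort(key=lambda p: p["seq"])  # restore original order
        sessions[flow] = b"".join(p["payload"] for p in pkts)
    return sessions
```

Only after this reassembly does the traffic become searchable as e-mails, chats or webpages rather than as meaningless packet fragments.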

Diagram showing the dataflow for the DeepDive version of XKEYSCORE
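To make the distinction concrete, here is a minimal Python sketch of what "sessionizing" means in the generic deep-packet-inspection sense: captured packets are grouped by a flow key and their payloads reassembled in sequence order. This is purely illustrative (it is not from any NSA document, and the field names are hypothetical):

```python
# Toy illustration of "sessionizing": reassembling captured IP packets
# into their original application-layer data, rather than cutting data up.
from collections import defaultdict

def sessionize(packets):
    """Group packets by a (src, dst, proto) flow key and reassemble
    each flow's payload in sequence order. Packets are dicts with the
    hypothetical keys 'src', 'dst', 'proto', 'seq' and 'payload'."""
    flows = defaultdict(list)
    for pkt in packets:
        key = (pkt['src'], pkt['dst'], pkt['proto'])  # flow identifier
        flows[key].append(pkt)
    # Sort each flow by sequence number and concatenate the payloads
    return {
        key: b''.join(p['payload'] for p in sorted(pkts, key=lambda p: p['seq']))
        for key, pkts in flows.items()
    }
```

Even when packets arrive out of order, the reassembled session contains the original data stream, which is what makes content analysis possible.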

Snowden describes the back end of XKEYSCORE as "an interface that allows you to type in pretty much anyone's address, telephone number, or IP address, and then basically go through the recent history of their online activity." He then says that he would have been able to type in the names of the NSA director or the US president. (p. 279)

He had already claimed to have such an "authority" in his very first video appearance on June 9, 2013, but afterwards Glenn Greenwald had to admit that although such searches were not legally permitted, they were technically possible.

The technical possibilities, however, are limited too: in order to retrieve communications via XKEYSCORE, the NSA first has to have physical access to communication links that carry the target's traffic. Therefore it's definitely not the case that "Everyone's communications were in the system" as Snowden says. (p. 279)

What Snowden doesn't tell us is that the actual purpose of XKEYSCORE, and its unique capability, is finding files which are not associated with specific selectors so analysts can trace targets who are using the internet anonymously.

Intimate images

Snowden assumes that none of his new colleagues intended to abuse XKEYSCORE's capabilities, but that if they did, it would be for personal rather than professional reasons. This led to what he calls "the practice known as LOVEINT [...] in which analysts used the agency's programs to surveil their current and former lovers". (p. 280)

It's rather exaggerated to call this a practice, because in 2013 NSA Inspector General George Ellard reported that since January 2003 there had been 12 instances of intentional misuse of NSA collection systems. Of these 12 cases, only 8 involved current or past lovers or spouses, most of them foreigners, and they were brought to light either through auditing controls or self-reporting.

Apparently more often, male analysts alerted each other to nude photos they found among target communications, "at least as long as there weren't any women around" - which may be one of the reasons that the NSA has adopted a strong diversity policy. (p. 280)

Snowden on the other hand was most touched by "the family stuff" and recalls how he saw a webcam recording of a little boy sitting in the lap of his father, an Indonesian engineer who had applied for a job at a research university in Iran "that was suspected of being related to a nuclear program or a cyberattack" and therefore became of interest to the NSA. (p. 281-282)

As unprofessional as some of his colleagues were in sharing nudes, Snowden seems to have had difficulty keeping a professional distance from his targets. The video of the boy reminded him so much of his own father that he, almost in shock, realized that he would probably never see his family again. (p. 282)

Daniel K. Inouye International Airport in Honolulu, Hawaii
(photo: hellochris/Wikimedia Commons - click to enlarge)

Leaving NSA Hawaii

In the weeks before leaving for Hong Kong, Snowden copied the last set of documents he intended to disclose and tried to decide in which country it would be best to meet Poitras and Greenwald. With Russia and China out of bounds, the process of elimination left him with Hong Kong. (p. 283-284)

The final preparations he made "were those of a man about to die". He told his supervisor at Booz Allen that he needed a leave of absence of a couple of weeks for epilepsy treatment on the US mainland and he left his girlfriend a note saying that he was called away for work. (p. 283-284)

Then Snowden packed some luggage, including several thumb drives full of NSA documents, and four laptops: one for secure communications, one for normal communications, a decoy and one that he kept "airgapped". He left his smartphone at home, went to the airport and bought a ticket in cash for the next flight to Tokyo. There, he bought another ticket in cash and arrived in Hong Kong on May 20, 2013. (p. 285)

> To be continued!

Links & sources

- Le Monde: Bug Brother: Pourquoi je préfère la BD sur Snowden à son autobiographie (Dec. 18, 2019)
- Emptywheel: Snowden Needs a Better Public Interest Defense, Part I - Part II (Nov.-Dec. 2019)
- Rolf's Blog: Review of Ed Snowden's "Permanent Record" (Oct. 10, 2019)
- The New York Review of Books: Snowden in the Labyrinth (Oct. 2019)
- Matthew Green: Looking back at the Snowden revelations (Sept. 24, 2019)
- The New Yorker: Edward Snowden and the Rise of Whistle-Blower Culture (Sept. 23, 2019)
- The New Republic: Edward Snowden's Novel Makeover (Sept. 17, 2019)
- Wired: After 6 Years in Exile, Edward Snowden Explains Himself (Sept. 16, 2019)
- The Guardian: Interview by Ewen MacAskill (Sept. 13, 2019)
- Der Spiegel: 'If I Happen to Fall out of a Window, You Can Be Sure I Was Pushed' (Sept. 13, 2019)
- House Permanent Select Committee on Intelligence: Review of the Unauthorized Disclosures of Former National Security Agency Contractor Edward Snowden (Sept. 15, 2016)
- Wired: Edward Snowden: The Untold Story (Aug. 2014)
- Vanity Fair: The Snowden Saga: A Shadowland of Secrets and Light (May 2014)

by P/K (noreply@blogger.com)

January 28, 2020

Classic Programmer Paintings


Jenkins in the #dev channel

Antoon Claeissens (c.1536 – 1613)

January 27, 2020

The Society for Social Studies of Science

Labor Out of Place: On the Varieties and Valences of (In)visible Labor in Data-Intensive Science

We apply the concept of invisible labor, as developed by labor scholars over the last forty years, to data-intensive science. Drawing on a fifteen-year corpus of research into multiple domains of data-intensive science, we use a series of ethnographic vignettes to offer a snapshot of the varieties and valences of labor in data-intensive science. We conceptualize data-intensive science as an evolving field and set of practices and highlight parallels between the labor literature and Science and Technology Studies. Further, we note where data-intensive science intersects and overlaps with broader trends in the 21st century economy. In closing, we argue for further research that takes scientific work and labor as its starting point.

by Irene V. Pasquetto (irene_pasquetto@hks.harvard.edu)

Zero Days

The new cybercrimes, from WannaCry to SilkRoad

In a recent essay, Carola Frediani maps the new landscape of cybercrime, drawing on three significant episodes of recent years: the WannaCry case, the attack on the Democrats' computers during the last US presidential election, and the arrests and investigations surrounding the big drug markets on the dark web.

Building on Carola's observations, in this podcast we retrace the facts and try to sketch the threatening current landscape.

by Giovanni Ziccardi

January 26, 2020

Classic Programmer Paintings


Hieronymus Bosch, The Healing of Madness (Three Consultants eliciting customer’s requirements), oil on wood, 1475.

January 25, 2020

Classic Programmer Paintings


RMS Introducing GNU in 1983 (According to RMS)

Michelangelo c. 1512 Fresco 9 ft 2 in × 18 ft 8 in

January 24, 2020

Classic Programmer Paintings


“Developer showing headless server”

Jan (Hans) Witdoeck, Engraving, 1639

The Society for Social Studies of Science

Science and Democracy Reconsidered

To what extent is the normative commitment of STS to the democratization of science a product of the democratic contexts where it is most often produced? STS scholars have historically offered a powerful critical lens through which to understand the social construction of science, and seminal contributions in this area have outlined ways in which citizens have improved both the conduct of science and its outcomes. Yet, with few exceptions, it remains that most STS scholarship has eschewed study of more problematic cases of public engagement of science in rich, supposedly mature Western democracies, as well as examination of science-making in poorer, sometimes non-democratic contexts. How might research on problematic cases and dissimilar political contexts traditionally neglected by STS scholars push the field forward in new ways? This paper responds to themes that came out of papers from two Eastern Sociological Society Presidential Panels on Science and Technology Studies in an Era of Anti-Science. It considers implications of the normative commitment by sociologists working in the STS tradition to the democratization of science.

by Joseph Harris (josephh@bu.edu)

Hidden Injustice and Anti-Science

This essay responds to the five articles on Anti-Science in this journal issue by discussing a significant theme identified across all of them: hidden injustice. Some of the ways that injustice is hidden by organizational forces related to anti-science are identified. In response, the essay points to the need for empirical data on anti-science policies, a symmetric approach to anti-science contexts, and institutional analysis of anti-science power imbalances. Additionally, a reflexive question about whether anti-science analysis in STS leads the field toward racial justice is raised. The essay calls for further organizational level research with a critical STS lens to uncover hidden injustice.

by Laurel Smith-Doerr (lsmithdoerr@soc.umass.edu)

Learning in Crisis: Training Students to Monitor and Address Irresponsible Knowledge Construction by US Federal Agencies under Trump

Immediately after President Trump’s inauguration, US federal science agencies began deleting information about climate change from their websites, triggering alarm among scientists, environmental activists, and journalists about the administration’s attempt to suppress information about climate change and promulgate climate denialism.  The Environmental Data & Governance Initiative (EDGI) was founded in late 2016 to build a multidisciplinary collaboration of scholars and volunteers who could monitor the Trump administration’s dismantling of environmental regulations and science deemed harmful to its industrial and ideological interests.  One of EDGI’s main initiatives has been training activists and volunteers to monitor federal agency websites to identify how the climate-denialist ideology is affecting public debate and science policy.  In this paper, we explain how EDGI’s web-monitoring protocols are being incorporated into college curricula and how, in this way, EDGI’s work aligns with STS work on “critical making” and “making and doing.” EDGI’s work shows how STS scholars can establish new modes of engagement with the state that demand a more transparent and trustworthy relationship with the public, creating spaces where the public can define and demand responsible knowledge practices and participate in the process of creating STS inspired forms of careful, collective, and public knowledge construction.

by Gretchen Gehrke (gehrke.gretchen@gmail.com)

STS Currents against the “Anti-Science” Tide

This essay considers some possible relationships that STS scholars can have with activists who are resisting attacks on environmental science. STS scholars can document the counter-currents to the “anti-science” moment, work in partnership with activists outside of academia, use access to institutional resources to give environmental movements strength, use STS research to help activists better understand the policy process and the history of science funding, and help people to develop a sociological imagination about science and the environment.

by Abby J. Kinchy (Kincha@rpi.edu)

From Sideline to Frontline: STS in the Trump Era

The Trump presidency and its relationship to science and truth have prompted considerable reflection as well as significant action by STS scholars.  Among those thinking, speaking, and acting are the authors of the articles in this thematic collection.  This brief introduction summarizes the major strands in each of the articles, placing them in the context of current political trends.

by Daniel Lee Kleinman (dlkleinman@gmail.com)

Upgraded to Obsolescence: Age Intervention in the Era of Biohacking

Popularized by DIY scientists and quantified-selfers, the language of “biohacking” has become increasingly prevalent in anti-aging discourse. Presented with speculative futures of superhuman health and longevity, consumers and patients are invited to “hack” the aging process, reducing age to one of the many programs, or rather “bugs” that can be re-written, removed, and rendered obsolete. Drawing on recent examples from popular media and anti-aging promotional materials, I explore how the language of biohacking signals an orientation to the body that denies the acceptability of a body that is anything but optimal. In the endless strive towards the latest and greatest, the language of biohacking renders the old body obsolete, standing as nothing more than a relic of an outdated operating system.

by Kirsten L. Ellison (kellison@trentu.ca)

Du Boisian Propaganda, Foucauldian Genealogy, and Antiracism in STS Research

This essay explores the relationships between the “new” anti-science formation under Trump and the kinds of anti-Black racisms we are experiencing at present. What appears at first glance to be a new anti-science formation, isn’t new at all, but old wine in new cloth, all dressed up to confound and distract our gaze from power. The vast majority of Black and Brown people are not surprised nor fooled by Donald Trump and the danger he represents to truth, to our lives, to our precious Earth. For that matter, how are STS scholars working to produce anti-racist knowledge that directly benefits Black people? In this commentary, I briefly respond to these questions by exploring how wildly contrasting accounts of propaganda, truth, and science by W.E.B. Du Bois and Michel Foucault might help STS scholars make sense of the relationship between anti-Black racism and the current anti-science moment in American society.

by Anthony Ryan Hatch (ahatch@wesleyan.edu)

We Have Never Been Anti-Science: Reflections on Science Wars and Post-Truth

This essay addresses the so-called "post-truth" era in which scientific evidence of, for example, climate change, is given little weight compared to more immediate appeals to emotion and belief, and examines the relationship of alleged anti-science and populist irrationality to left- and right-wing political alignments.  It also addresses charges of anti-science that were once leveled at Science and Technology Studies (STS) itself, and particularly in relation to the “symmetrical” posture taken toward scientific controversies.  Recently, "symmetry" in STS has been linked to the media conventions and argumentative strategies that have sustained controversies over climate change and other health and safety concerns.  This essay argues that "symmetry" was originally set up in a circumscribed way to encourage research on controversies, but that it does not amount to a general conclusion to the effect that science is no different from any other system of belief.  Instead, an effort to pursue "symmetrical" research on scientific controversies can document how, far from being displaced from all relevance, scientific authority and its institutional supports are being duplicated along parallel tracks which sustain disputes and delay concerted action. 

by Michael Lynch (mel27@cornell.edu)

Drought, Hurricane, or Wildfire? Assessing the Trump Administration’s Anti-Science Disaster

We describe the Trump Administration as an “anti-science disaster” and approach study of the phenomenon as other disaster researchers might study the impacts of a drought, hurricane, or wildfire. An important, but rare, element of disaster research is identification of baseline data that allow scientific assessment of changes in social and natural systems. We describe three potential baselines for assessing the nature and impact of Trump’s anti-science rhetoric and (in)action on science, science policy, and politics.

by Christopher M. Rea (rea.115@osu.edu)

Institute of Network Cultures

The Data Prevention Manifesto by the Plumbing Birds


The privacy discourse has run out of steam. This has led to the current stalemate: we know we're observed, traced and tracked, but pretend it's not happening or is nothing to fret about. The question is not when the repressed will return, but how. Hackers have been proclaiming for decades that privacy is dead, that everything can and will be captured, stored and analyzed. And they were right. What's to be done?

What’s the best way to protect one’s self if not prevent to transmit data in the first place? Effectual hindrance of data coming into being. How to convene a collective dimension of “social networking” without being aggregated in huge data silos extraneous to us, yet profiting on us? 
How can we reclaim autonomy in our everyday life, knowing that there are all these sensors, bots and algorithms are still active? How can these technologies ever be decommissioned?  Are we perhaps waiting for a Great Showdown, a WorldWar, a millennial cyber attack that brings down the entire infrastructure, a bad solar flare or an electronic magnetic pulse knocks out the power grid and erase all hard drives? Or are we about to fall asleep and be mumbed forever, having accepted that everything we do, think and desire, can and will stored, and can be used against us?

We need to de-codify contestation in order to multiply the lines of flight outside of calculated settings. We need to ask the hard questions, too. Do ad blockers, filters, firewalls, close reading of terms & conditions and online protests against the collection and reselling of private data merely mitigate the problems that are at stake? Or, and perhaps more to the point, what logic does data prevention participate in? Is it, effectively, the same logic it aims to cloak and hide from? Why do we think life can be informationalized? What desire feeds the notion that big data can be transformed into a knowable, manipulated, gamed, anticipated, preempted, capitalized and controlled life? Are we hedging and feeding Unicorns or Frankensteins?

We’re not talking about the weather. Let’s move from protection to the design of a serum. Do not feed the platforms. We’re proposing creative sabotage, concrete forms of prevention that undermine the ‘big data’ regime on all levels, from the molar to the molacular. Let’s take concrete steps towards an overall data reduction. We will no longer feed the data-hungry Minority Report machines that are programmed to identify emerging erratic behavior. Prevention sounds innocent, but make no mistake, it is not. In many cases prevention itself is already seen as a crime. Do we only talk about preventing events from happening, or are we also generating new scenarios? Data prevention is a direct response to top-down smart city technologies. We aim to uphold the preventive-strike (https://en.wikipedia.org/wiki/Preventive_war).

Data prevention is part of a longer history, from Native Americans refusing to have their pictures taken lest their souls be stolen, to punks in the streets of London refusing to be photographed back in the '70s.

Ted Hughes once spoke of the grin that was trying out faces. Well, that is real. In our case it is fear attempting to become integral. Fear is trying out all human capabilities. Fear can trick us into living a nudged and predictable life. This has come to pass as generation after generation of corrupt, spineless, greedy people have been in the driver's seat. We cannot call it leadership. These people - let's say Bilderberg, the Fortune 500 and MBAs globally - have facilitated anything 'easy', outsourcing anything 'hard' to places that could be exploited. So now there is no more place to exploit. Hence the fight for the internal space, the very notion of what it means to be human. If people could no longer be exploited and enslaved but woke up, the shame some would feel at having been deceived for so long about what it means to be living - hmm, hara-kiri would be their only option. Most of them cannot wake up, as there is no longer a self to wake up to. They are zombies. No, it is not a coincidence that we have them on our TV. The writers feel they are real. I, too, see them every day. They have been among us for quite some time. I just never thought there would be so many. And we so few.

Before we launch our campaign and gather the many ideas for designing products and services that do not gather data in the first place, it is important to say farewell to the premise that data is the oil of the 21st century. Not only is it questionable that data can and will be financialized, as if this were an inevitable next step programmed deep inside the data itself. We should also question the 'mining' aspect of the metaphor, as if digging up resources were not a devastating environmental crime that ruins our planet, from tar sand mines to coal pits and digging for cobalt. Mining comes at a price. We want to disassociate ourselves from the dark side of the financialization of data. Mind the metaphors you use.

Along the same lines we need to get rid of the idea that data traces are things we 'leave behind' in some careless way. This not only legitimates the dragnet, it diverts attention from the rather aggressive techniques that inspect our browsers, networks, and devices. We therefore need to reshape the possibilities of data production. This also means we need to stop drawing parallels between computation machines and the human brain, between data and grey matter.

If it is true that the machine only works when all the relevant people are convinced, we need to tell other tales. Convictions are not innocent; they are about re-making worlds. You invest in them.

Let’s stop celebrating the invisible, de-activation, retreat. Let’s quit the visualization of data centers, data points, data pattern, data collection and recognition algorithms. We’re tired of being smeared with how Big Data might smell, feel, look or sound. We do no longer want to play into the game or change its rules.

As Heidegger wrote somewhere, Hebel built the Almanac. When his contemporaries wrote the first novels and poetry of old, he invented this tacky genre of riddles for Tuesday, and a motto for April. And the weather? Well, the weather. In his short text on Hebel, Heidegger writes in 1957: "We are roaming through a world that lacks a companion (friend of your home, huisvriend). Someone who understands and appreciates this technical global frame and the world as home for authentic living, both equally strong and loving. We lack such a companion capable of accompanying the predictability and the technicality of nature to the open secret of a renewed experiential naturalness of nature. We lack that companion indeed. But we sense his arrival."

Data prevention is not a strike; it is only perceived as sabotage by the apparatus that needs to be fed with data. We do not believe in safe ways to deal with 'big data' that have been collected to monitor, and control, populations. What we prevent here is a conditioned, ultimately boring life that limits itself. Let's get rid of the guilt of doing the forbidden and then feeling the heavy presence of Big Brother, the all-seeing God that will remember every tiny move or bad thought. Let's see it as consensual sex without consequences: data prevention creates space for pleasure and possibilities; it is not done to save precious space on our hard disks. Being in the space of possibility is breaking free from the dual pole of production (of data) and paranoia (about the same dynamic).

Let us err. Collectively.

We need to materially engage with the enigmatic, the flawed, the partial, the impure, the surprise, the transgressive, the black swan.

These days prevention is an offensive strategy that questions hidden power relations. It's not just passive hiding but taking action. In WWII the Germans - in an attempt to confuse Allied pilots - covered large areas with nets or painted wooden structures. One night a single RAF plane flew over such a 'village' and dropped one wooden bomb. Let's prevent this political tech design initiative from ending up in the offline Romanticism section. All actions, gestures, thoughts and movements can and will be captured and caught in the data trap. This can make us depressed, but this fate can also be turned upside down. There was and always will be an abundance of data. Let's break free from the prospect of locking ourselves up in voluntary monasteries and other tribe-like, inward-looking social structures. There is no need to save data, let alone recycle them.

Data prevention makes a fresh start and leaves the tired discourse behind. The idea is no longer merely to filter, install blockers and build walls, protecting ultimately unstable and open architectures. We create new design principles. Data prevention goes on the offensive. We're tired of having to protect ourselves. Join this new design movement! Make people aware of what happens, and switch it off.

Engineers are taking us to real-time, thinking that it is an empty space. But we live there, in dreamtime. We were never caught by surprise in the plains. The rain dance went limp. Authorities caught us fishing, labelled us aboriginals, slaughtered us, filled our minds with cluttering noise and meaningless chatter chatter chatter. Counting on this, we would lose the open line. This time, this time, we occupy that space with our tools and dreams as we live, and we eat your tools and 'logic'.

We, the Platform Plumbers, want you to tell us about your favorite things. Describe them any way you want. We record the richness of your emotions and feelings for your favorite things. We are getting used to doing with less. After all, the sensors just measure light, motion, temperature, magnetic fields, gravity, humidity, moisture, vibration, pressure, electrical fields, sound, stretch, glucose level, oxygen level, or osmolality - anything else? Probably a few things, but you can do much better. Don't you forget that. Practice. Start now and join us in shaping the Dionysian design genre.

Do not be afraid: we will work with you and your structures. We do need to appreciate the ethos of bureaucracy. All your assets, as you call them, belong to us, to none, to no one. You, my friend, who are evil: keep yourself evil still; the four horsemen and the jumper are hurrying down.

So it is up to us to make things find the hard road again. This is not a mass movement. It can only be done with those who are on the path already. If you are, read on. If you are not, don't feel bad: there are plenty of things left to do, and who knows, you may break through one day.

We will see you when you see us. You’ll see us when we see you.

The Platform Plumbers

(respect for Dowse http://dowse.eu and Fairsky http://fairsky.org)

Platform of the Plumbing Bird

Platform Plumbers 
we know what hides behind walls
and how to drill all the necessary holes
for good and bad
to fix, to change and maintain the pipes
to flood, disrupt, deny the waters.
Responsibility is not control
Awareness is a merciful weapon for the wise.
It is better to be aware today (brothers and sisters)
and awake at night as we must
standing for responsible data pregnancy,
and fair data under a fairer sky

Plato holds data (doxa) accountable for confusion and all wrong opinions
and Episteme he calls the field of the wise, that knows how to mark the land.
We don’t need faith into some theoretical dogma
We stay on the side of the platform
-on which opinions fall as droplets-
and we sum up -upon us- the ability to collect these waters
knowing what to share, to store, to let go.
Because we are willing to use the source wisely
in the time of the drought
for humans, animals and woods alike
Because we share the point of view
of the whole ecosystemical bunch

Hold up to this notion of belonging
hold up to your refined data politeness
over their agendas of hidden data policies
and know what you can then well teach
to hold on before what you can only preach well

They say: money
money -without us- has no economic trust
we can better put trust into money that we bake up ourselves
and get on the path uphill
to “other ideas” that are good. Better.
Virtualised economies
fueled up by remote notions of debt
are obsolete by design,
and going to be worthless

Citizens, conspire to unfulfil the one-way exploitation
to grow ecosystems of polite automatic conversations between pairs
instead of feeding motorised pushers of unwanted sleeping pills and snake oil
and allow these drops to be clean water for all of the living souls on this plain

while we learn, and sleep and love, outside,
the un-attended trap of tracked-down, data-silo-stored,
overvalued obviousness works against us
we have no agency on the data, and we should
and the data is not the truth but her shadow, we know
to be used at will by self-hypnotised puppeteers
to create fictions that do not compile into histories;
advertising for unwanted goods;
bedtime stories for the lust of self-loving politicians;
serving the ideals of "the one percent"
We should have agency on data
because every shadow is a shadow of a body
hit by the light under the sun
we shall be close to these droplets and their sources
and the melting ice in spring
and say no thanks when we must
and please no when we feel it is our right

We shall meet again (brothers and sisters)
on the verges of this gorgeous green land
once a desertified mud-bowl
each of us able to speak the language of choice
to amuse the friends and make the children laugh
and we will be called the platform plumbers
the designers of the garden’s grid
the layers of the pipes, of filters, of the recycling ponds
the choosers of the right seeds to keep
the letting-go of wild grasses and bees
-because no design holds the whole-
we will be remembered as the observers of the waves
the happy carvers of algorithmic stones
that needed to grow into no pyramidal graves

Let the policies of the politeness-poets speak tonight
in, above and below the grid

by Geert Lovink

January 22, 2020

Bretton Woods Project

Recommended resources on the World Bank and IMF 2019


The Case for the Green New Deal
Pettifor, A., Verso, September 2019

Pettifor provides an overview of the macro-economic reforms required to achieve a global Green New Deal; she argues for a 'steady-state economy' to drive green investment, displacing the status quo: the globalized, dollarized financial system.



2019 Trade and development report: Financing a Global Green New Deal
UN Conference on Trade and Development, September 2019

UNCTAD’s 2019 report argued that in an effort to establish a Global Green New Deal, a serious discussion of public financing options is first required for governments to reclaim policy space and collectively act to boost demand, to enable the massive new wave of investments required to tackle climate change.

A new multilateralism for shared prosperity: Geneva Principles for a Global Green New Deal
Boston University Center for Global Development Policy Center and UNCTAD, April 2019

This report introduces “a set of ‘Geneva Principles for a Global Green New Deal’ that can form the foundations for a new multilateral trade and investment regime that has shared prosperity and sustainable development as its core goals.”

Banking on Asia: Alignment with the Paris Agreement at six development finance institutions in Asia
E3G, October 2019

This report "assesses the level of Paris Agreement alignment within the Asian Development Bank, Asian Infrastructure Investment Bank, China Development Bank, Japan International Cooperation Agency, Korea Development Bank and the World Bank. It uses 10 indicators or metrics to assess progress on the various facets of climate action, including mitigation and adaptation, climate risk, greenhouse gas accounting and others."

Blended finance in the poorest countries: The need for a better approach
Overseas Development Institute, April 2019

This paper casts doubt on the World Bank’s claim that it will leverage private finance from ‘billions to trillions’. It highlights that each $1 invested mobilises on average $0.75 of private finance for developing countries, but this falls to $0.37 for low-income countries.

Burning the gas ‘bridge fuel’ myth: Why gas is not cheap, clean or necessary
Oil Change International, May 2019

The report notes, “The myth of gas as a ‘bridge’ to a stable climate does not stand up to scrutiny. While much of the debate to date has focused on methane leakage, the data shows that the greenhouse gas emissions just from burning the gas itself are enough to overshoot climate goals.”

Climate change and the cost of capital in developing countries
UNEP, Imperial College Business School and SOAS, April 2019

This report “represents the first systematic effort to assess the relationship between climate vulnerability, sovereign credit profiles, and the cost of capital in developing countries.” It shows that over the past ten years, climate change has resulted in an increase of $40 billion in government debt payment in climate vulnerable countries.

Digging deeper: Can the IFC’s green equity strategy help end Indonesia’s dirty coal mines?
Inclusive Development International, Bank Information Center Europe, and Jaringan Advokasi Tambang (JATAM), April 2019

The report notes that, “Recently, the IFC unveiled a series of policies designed to cut off such financing for coal, a development that civil society has welcomed. But how will these commitments be implemented in practice, and how will the IFC and its financial-sector clients address the damage they have contributed to in Indonesia?”

Gender inequality in the tax systems of Arab countries: Egypt, Lebanon, Tunisia and Morocco
Arab NGO Network for Development (ANND) and partners, February 2019

This report analyses the gendered impacts of taxation policy both through explicit and implicit bias in tax policy, with recommendations for progressive reforms that can promote greater gender equality.

Securitization and sustainability: Does it help to achieve the Sustainable Development Goals?
Daniela Gabor, Heinrich Boell Stiftung North America, October 2019

The report argues that, “the new Wall Street Consensus — that re-imagines international development interventions as opportunities for global finance — will not deliver on its promises to deliver sustainability via securitization. The potential gains from organising development interventions around questions of ‘how to sell development finance to the market’ are overstated, while the costs are downplayed.”

Spotlight on financial justice
Citizens for Financial Justice, September 2019

This briefing details the impact of financialisation of essential services through a study of its impact on food and land, health, women’s rights, housing, and infrastructure. The report shows that rising inequalities, and the overexpansion of the finance industry as one of its key contemporary drivers, have been created and reproduced by skewed and unfair rules of the game.  The report makes it clear that local level resistance to financial actors’ penetration is extremely important, but that confronting the drivers of inequality which are now global, such as financialization, requires concerted efforts at higher levels of policy-making as well.

Spotlight on sustainable development 2019: Reshaping governance for sustainability
Reflection Group on the 2030 Agenda for Sustainable Development, July 2019

Four years after the adoption of the 2030 Agenda, the world is off-track to achieve the Sustainable Development Goals (SDGs). This report presents the governance arrangements, structures and institutions that are necessary to implement these policies and to unleash the transformative potential of the SDGs, including the role of the IMF and the World Bank.

Too high a price for the poor and climate? The World Bank’s energy access programme in Myanmar
Bank Information Center Europe, Swedish Society for Nature Conservation, Hivos, and SaNaR (Save the Natural Resource), October 2019

This report provides in-depth analysis of the World Bank’s support for Myanmar’s energy investments, in a country where electricity access rates are at the lowest in Southeast Asia.

UK aid for combatting climate change inquiry
International Development Committee, UK Parliament, May 2019

The Committee report calls on the UK Government to set an example and provide international leadership to realign the focus of aid policy, strategy and funding, to place climate change at the centre — including through its shareholding power in multilateral development banks. It notes that, “The negative consequences of a failure to act will be so serious as to nullify the effectiveness of wider aid spending.”

Uncalculated risks threats and attacks against human rights defenders and the role of development financiers
Coalition for Human Rights in Development, May 2019

This landmark report has exposed the risks of mega-infrastructure and other ill-planned development projects on human rights defenders, and the role of development finance institutions in financing these projects, including the International Finance Corporation (IFC), the World Bank’s private sector arm.

World Bank financial flows undermine the Paris Climate Agreement
Urgewald, April 2019

This well-documented report details how, despite its rhetoric and recent changes in policies, the World Bank continues to finance fossil fuels. The report demonstrates that fossil fuel investments make up a greater share of the Bank’s current energy lending portfolio than renewable projects. According to the report, the World Bank Group has provided $21 billion in loans, grants, guarantees and equity investments to support fossil fuel projects, and $15 billion for renewable energy projects.



Austerity: The new normal. A renewed Washington Consensus 2010–24
Isabel Ortiz and Matthew Cummins, October 2019

This working paper examines historical and projected government expenditure trends for 189 countries, discussing the negative social impacts of austerity measures ushered in by the renewed Washington Consensus. It calls for urgent action by governments to identify fiscal space to accelerate development, human rights, a green recovery with jobs and inclusive growth, and progress towards the Sustainable Development Goals.

Billions to Trillions – A reality check
Stamp Out Poverty and RE-DEFINE, March 2019

The briefing highlights issues with blended finance, promoted by the World Bank and development banks, particularly how much it realistically mobilises, the limits of scaling up the model and concerns of tied aid.

Can public-private partnerships deliver gender equality?
Eurodad, Femnet and Gender and Development Network, March 2019

This briefing aims to contribute to the growing civil society debate about public-private partnerships, exploring how PPPs can undermine gender equality.

Carbon pricing and the multilateral development banks: Comparative analysis and recommendations
Oil Change International, WWF and E3G, May 2019

The briefing notes, “Several of the Multilateral Development Banks (MDBs) currently use shadow carbon pricing to inform their decision-making when assessing potential transactions. To promote the alignment of the MDBs with the Paris Agreement, this briefing explores their use of shadow carbon pricing and offers recommendations on best practices.”

Effects of foreign debt and other related international financial obligations of States on the full enjoyment of all human rights, particularly economic, social and cultural rights
UN Office of the High Commissioner of Human Rights, July 2019

This UN report demonstrates that austerity measures imposed by international financial institutions such as the IMF, regularly cause violations of human rights.

External debt vulnerability
Daniela Gabor, University of the West of England, 2019

This short presentation explains the dynamics of external debt vulnerability and sets out the problems with the proposed Maximising Finance for Development which is strongly advocated by the World Bank.

False promises: How delivering education through public-private partnerships risks fuelling inequality instead of achieving quality education for all
Oxfam International, April 2019

The World Bank has been increasingly promoting education public-private partnerships through its lending and advice, despite evidence that shows this often fails the most vulnerable children. This paper recommends that the World Bank and other donors should stop promoting and financing market-oriented education schemes and focus on expanding quality public schooling as a human right for all.

Financing gender inequality: why public-private partnerships leave women worse off
Equality Trust, October 2019

This briefing offers a simple introduction to the issue of public-private partnerships through a gender lens, outlining why strong public services are essential for achieving gender equality, how the rise of public-private partnerships risks undermining this, and offering policy recommendations that seek to meet the needs of women around the world.

Financing injustice
Christian Aid, January 2019

This briefing looks at progress towards the Sustainable Development Goals (SDGs), financing for development, private finance and alternatives, and good investment. It argues for the need for a radically different and rebalanced financial system which ensures that the very poorest are included and actively supported to thrive, and in which developing countries have an equal say in making the rules governing the global economy.

From the Washington consensus to the Wall Street consensus
Heinrich Böll Stiftung, October 2019

This paper reviews the recent initiative led by the G20 countries and their respective development finance institutions (DFIs), including the major multilateral development banks (MDBs), for the financialization of development lending that is based on the stepped-up use of securitization markets. It describes the key elements of the new initiative — specifically how securitization markets work and how the effort is designed to greatly increase the amount of financing available for projects in developing countries by attracting new streams of private investment from private capital markets.

Gender responsive public services and macro-economic policy in Ghana
ActionAid International, April 2019

This briefing paper draws from the findings of an in-depth policy review and fieldwork commissioned by ActionAid’s Young Urban Women’s Programme to explore the ways that IMF-sponsored macro-economic policies have affected the provision of essential public services, with a special focus on Ghana’s health, water and sanitation sectors.

How social protection, public services and infrastructure impact women’s rights
Gender and Development Network and Femnet, January 2019

Ahead of the 63rd UN Commission on the Status of Women, which focused on this topic, this briefing explored how governments can deliver integrated gender-responsive and transformative social protection, public services and infrastructure.

Push no one behind: How current economic policy exacerbates gender inequality
Gender and Development Network, July 2019

This briefing explores Agenda 2030’s “leave no one behind” principle, which does not address the structural barriers that stand in the way of gender equality. It argues that it is not just that women are left behind, but that they are pushed behind by current economic policies including austerity, labour market flexibilization, privatisation and liberalisation of trade and investment rules.

We can work it out. 10 civil society principles for sovereign debt resolution
Eurodad, September 2019

In the face of a new wave of debt crises, and with no systematic process under which sovereign debt restructurings can take place and no possibility for a country to restructure its entire debt stock in one place and in one comprehensive procedure, this briefing from Eurodad makes the case for a debt workout mechanism.

Working towards a just feminist economy: The role of decent work, public services, progressive taxation and corporate accountability in achieving women’s rights
Womankind Worldwide, March 2019

This briefing looks at how to shape the global economy to work for all women through progressive taxation, corporate accountability, gender transformative public services, social protection and decent work.



A global strike for a feminist future of work
openDemocracy, October 2019

The Women’s Global Strike Working Group (ourEconomy, ActionAid, FEMNET, Womankind Worldwide and Fight Inequality Alliance) explores the future of work discussion, especially the role of the World Bank and IMF in their capacity as international financial institutions. It makes recommendations for a feminist future of work, highlighting the global strike on 8 March 2020.

Behind the numbers: Deconstructing the World Bank’s climate-finance claims
Bank Information Center, August 2019

This briefing digs into the reality behind the aggregate numbers of climate finance reported annually by the World Bank Group, providing valuable insights into what this finance is and where it ends up.

Capitalism, postcolonialism and gender: Complicating development
Sara Salem, Gender and Development Network, July 2019

This think-piece explores how both intersectionality and postcolonial feminism expand understanding of gender, race and capitalism in development practice and theory.

Greening the finance trinity
Dileimy Orozco, E3G, December 2019

This blog notes that, “In most countries, the ministry of finance, their central banks and regulators, and the International Monetary Fund (IMF) make up the finance trinity. These are the three institutions with the most power over the economic and financial system. They are critical to the low carbon and resilient transition, playing an important role in ensuring national finance and economics align with the Paris Agreement to limit global warming.”

It’s time to tax for gender justice
Caroline Othim and Roosje Saalbrink, openDemocracy, October 2019

This piece from the co-chairs of the Global Alliance for Tax Justice Tax and Gender Working Group blog discusses tax justice and gender equality, in the context of the United Nations General Assembly convened High-level Dialogue on Financing for Development.

Letter to World Bank president and executive directors calling for more action on climate change
The Big Shift Global, October 2019

This letter from 110 civil society organisations calls on the World Bank to “Phase out lending for all fossil fuels after 2020, including lending for ‘associated facilities’ for fossil fuel projects.”

Letter: Lack of transparency and adequate external stakeholder participation in the IFC/MIGA accountability framework review process
75 civil society organisations, October 2019

Seventy-five civil society organisations wrote to the World Bank board of directors about the upcoming accountability framework review of the International Finance Corporation and Multilateral Investment Guarantee Agency, which includes a review of their independent accountability mechanism, the Compliance Advisor/Ombudsman (CAO). The letter calls on the board to ensure that the accountability framework is informed by individuals and communities who have been affected by IFC/MIGA-supported projects.

Only 1% of gender equality funding is going to women’s organisations – why?
Kasia Staszewska, Tenzin Dolker and Kellea Miller, Association of Women’s Rights in Development (AWID), the Guardian, July 2019

The authors discuss how there’s been a $1bn boost in support in the last two years, but only tiny pots of money are trickling down to feminist groups.

Response to the IMF response on bailing out reckless lenders
Jubilee Debt Campaign UK, October 2019

Jubilee Debt Campaign UK replies to the IMF’s statement in response to JDC research showing that $93 billion of IMF loans are being used to bail out reckless lenders.

The World Bank’s new White Paper falls short on its objective of ‘protecting all’
Development Pathways, October 2019

This blog highlights the Development Pathways and ITUC report response to the World Bank’s White Paper on rethinking social protection systems to extend coverage, which argues that it proposes a “rollback of existing rights and protections for workers, both in terms of social security and labour market protections.”


by Isabel Alvarez


Zeit Online on the transit zones

An excellently researched and richly detailed article about the two transit zones on the Serbian-Hungarian border has appeared on Zeit Online; bordermonitoring.eu is also quoted in it. The two transit zones, closed camps on the Hungarian-Serbian border, are by now the only places in Hungary where refugees and migrants can apply for asylum. They have long been criticised: … Continue reading Zeit Online on the transit zones

by ms

Newsletter #1.2020

In the tenth issue of our newsletter, we summarise the latest developments of the border regime in Calais and the English Channel, in Bulgaria, in Greece, in Italy, in Hungary and along the Balkan route.

by ms

n-gate.com. we can't both be right.

webshit weekly

An annotated digest of the top "Hacker" "News" posts for the third week of January, 2020.

Mozilla lays off 70
January 15, 2020 (comments)
Mozilla successfully defends its title as only unprofitable major web browser vendor. The company remains certain that money will begin to flow from its new product, which promises to hide all your web traffic from everyone except Mozilla's primary funding source. Several unemployed Hackernews show up in the comments to tell us they are not angry about being shitcanned with no warning. Another Hackernews thread focuses on the fact that high-paid executives are treated better by the company than people who primarily interact with text editors. Whether this is some kind of class warfare or the Invisible Hand high-fiving the important people comprises the rest of the discussion. The team responsible for the continued functionality of the Beacon API are safe forever.

Unofficial Apple Archive
January 16, 2020 (comments)
Apple's equivalent of SeaOrg catalogues decades of corporate propaganda. Some Hackernews try to decide if Apple made better shit Back In The Goodle Days or if they're just stuck in a nostalgia fog. Other Hackernews gleefully recount their experiences as minor cogs in a massive machine. The rest of the comments are links to other fansites and arguments about which OS X widget theme was correct.

A Sad Day for Rust
January 17, 2020 (comments)
The Rust Evanglism Strike Force regretfully informs us of the excommunication of the author of one of the six Rust programs anyone actually uses. Veterans of the First Webshit Incursion sagely point out that this day was inevitable from the moment the author made use of certain features that are built into the core language but regarded as unclean by the clergy. Other Hackernews chime in to point out the necessity of being extremely polite and receptive to every single message received from anyone with a Github account. Within days, development of the condemned code resumes and the twelve websites which depend on the software do not notice anything happened.

Volkswagen exec admits full self-driving cars 'may never happen'
January 18, 2020 (comments)
A German correctly assesses an engineering problem. Hackernews is convinced that they know better, because a nontrivial percentage of them receive paychecks from companies that have incorrectly assessed the engineering problem, and the rest of them because they personally cannot correctly assess any engineering problem. Half of the almost nine hundred comments are in one thread which consists entirely of arguments between people who drive Teslas once in a while in fair weather and everyone else on Earth.

My FOSS Story
January 19, 2020 (comments)
An Internet struggles to be extremely polite and receptive to every single message received from anyone with a Github account. The article contains a shitload of whining interspersed with, and followed by, admonitions that this shitty situation is in everybody's best interest. Hackernews agrees, and praises the author for arguing with people on the internet. Some Hackernews briefly experiment with the idea that Github pull requests are not the most important webshit technology ever developed, but since all the software that permits disabling them are not Github, no progress is made.

Immune discovery 'may treat all cancer'
January 20, 2020 (comments)
The British Broadcasting Coöperative weasel-words a scientific discovery into clickbait. Hackernews, in between seeing the headline and clicking on the link, sprouts several medical research doctorates and weighs in with sober analysis and whatever medical facts they remember reading recently. Later, Hackernews lists every malady that should be cured. One Hackernews thinks maybe scientific research would work better if it somehow involved pull requests from Hackernews. Even Hackernews thinks this is a terrible idea, but is insufficiently scornful to the originating idiot.

Every Google result now looks like an ad
January 21, 2020 (comments)
An advertising company blurs the line between the dumb shit you searched for and the extremely important and well-targeted information you need. Hackernews doesn't like it, which leads to the same debate Hackernews always has when Google iteratively befucks its search product: "use this other search engine!" "but that is not Google" "no, but you can make it redirect to Google." This is followed, in accordance with Hackernews tradition, by a litany of complaints about other things Google is doing to fuck the internet up for everybody; as it turns out, literally everything Google does is by now aimed at fucking the internet up.

by http://n-gate.com/hackernews/2020/01/21/0/

Data Knightmare (Italian podcast)

DK 4x19 - Scienza Defiscienza, il caso del Cavolfiore Inappropriato

I have been saying for years that there is no science in data science. Here is the latest case study, made in Italy no less.

by Walter Vannini

January 19, 2020

Zero Days

A GDPR policy in anticipation of sanctions

After analysing around 170 sanctions imposed over roughly the last two years in almost all countries of the European Union, I have drawn up a policy of 15 (fifteen) rules that is very practical and tied precisely to situations of non-compliance that have already been verified and sanctioned, at least at "first instance".

This policy was discussed for the first time last week before the students of the postgraduate course in Data Protection at the University of Milan; it is a provisional document that will gradually be refined and expanded.

by Giovanni Ziccardi

January 18, 2020


Are the Shadow Brokers identical with the Second Source?

(Updated: January 18, 2020)

What many people don't know is that a range of classified NSA documents have not been attributed to Edward Snowden, which means there was at least one other leaker inside the NSA.

Initially, this leaker was called the "Second Source", and although he was responsible for significant leaks, they got little attention in the US. More media coverage went to the release, starting in 2016, of NSA hacking tools by the mysterious "Shadow Brokers".

Now, a close look at documents published by the German magazine Der Spiegel in December 2013 provides new indications that the Second Source could be identical to the leaker behind the Shadow Brokers.

NSA's Cryptologic Center in San Antonio, Texas (2013)
(photo: William Luther - click to enlarge)

The second source

The first leak that was not attributed to Snowden was an internal NSA tasking record showing that German chancellor Angela Merkel was apparently on the NSA's targeting list. The second revelation said to come from the same source as the Merkel record was the ANT product catalog, containing a wide range of sophisticated eavesdropping gadgets and techniques.

Security expert Bruce Schneier, who was probably the first to write about the possibility of a second source, said that this source apparently passed his documents to a small group of people in Germany, including hacktivist Jacob Appelbaum and documentary filmmaker Laura Poitras.

Because Poitras also received one of the initial sets of documents from Snowden, it is sometimes assumed that the documents from the Second Source may actually stem from the Snowden trove, despite not being attributed as such. For some of the individual documents, however, this was contradicted by Glenn Greenwald and Edward Snowden.

Spiegel reporting

The ANT catalog was published by the German magazine Der Spiegel on December 29, 2013. The original article was in German and written by Jacob Appelbaum, Judith Horchert, Ole Reißmann, Marcel Rosenbach, Jörg Schindler and Christian Stöcker. A translation in English mentioned the names of Jacob Appelbaum, Judith Horchert and Christian Stöcker.

Although this catalog got most of the attention, not least because Appelbaum explained the various tools during a presentation at the hacker conference CCC on December 30, it was actually just an addition to Der Spiegel's extensive main piece about the NSA's hacking division, called Tailored Access Operations (TAO).

This article was written by Jacob Appelbaum, Marcel Rosenbach, Jörg Schindler, Holger Stark and Christian Stöcker, with the cooperation of Andy Müller-Maguhn, Judith Horchert, Laura Poitras and Ole Reißmann. There was also a translation in English prepared by the Spiegel staff based upon reporting "by Jacob Appelbaum, Laura Poitras, Marcel Rosenbach, Christian Stöcker, Jörg Schindler and Holger Stark."

TAO documents

This main piece was accompanied by various NSA documents: one slide about FOXACID, a partial presentation about QUANTUM, two separate pages from other documents, as well as complete PowerPoint presentations about QUANTUM tasking, the TAO unit at NSA/CSS Texas, and the QFIRE architecture:

(click to go to the various documents)

Not Snowden?

What apparently has never been noticed before is that not only the ANT product catalog, but also these other presentations and documents were not attributed to Snowden. In both the German and the English version, the lengthy article repeatedly uses phrases like "internal NSA documents viewed by SPIEGEL", but never in combination with the name of Edward Snowden.

This is remarkable, because for the media it is usually almost a point of honor to publish documents provided by Snowden, which is then clearly mentioned in their reporting. In those cases, the byline includes the name of the person who actually provided the documents on Snowden's behalf, often Glenn Greenwald and, for Der Spiegel, Laura Poitras.

But both articles from December 29 have Jacob Appelbaum instead of Poitras in the byline, which seems to indicate that here the top secret NSA documents were provided by Appelbaum, likely as the middleman for the mysterious second source.

Exception: FOXACID slide

There's one exception though: the description of the FOXACID slide says that it comes from an NSA presentation in the Snowden cache. This was confirmed when, on August 19, 2016, The Intercept eventually published the full presentation about FOXACID.

This slide was probably provided by Laura Poitras, from her cache of Snowden documents, which would explain why she was mentioned as one of the persons that provided assistance for Der Spiegel's main piece of December 29.

The other presentations have not been published as part of the Snowden revelations; there is only one with a similar layout (from Booz Allen's SDS unit), but it is about a different topic.


If not only the ANT Product Catalog, but also these other NSA presentations about the TAO division were not provided by Snowden, but by the second source, what's the significance of that?

Analysing the range of revelations that were not attributed to Snowden resulted in the following list of documents that were likely leaked by the second source:

- Chancellor Merkel tasking record
- TAO product catalog
- XKEYSCORE rules: New Zealand
- NSA tasking & reporting France, Germany, Brazil, Japan
- XKEYSCORE agreement between NSA, BND and BfV(?)
- NSA tasking & reporting EU, Italy, UN

Except for the TAO catalog, one thing all these documents have in common is that they are different from the usual PowerPoint presentations, program manuals and internal wiki pages that make up the biggest part of the Snowden revelations.

(Of course, absence of evidence is no evidence of absence, but as these second source documents are often more significant than many other Snowden files, there seems to be no reason not to have published them had they come from the Snowden trove.)

The additional December 29 files do actually fit the typical sort of documents from Snowden, which makes it more difficult to distinguish between documents from Snowden and those from the other leaker(s).

The Shadow Brokers

If we look at the content of the files, we see that those from Der Spiegel's December 29 article are all about the NSA's hacking operations. There have been several Snowden stories about that topic, but more spectacular was the release, starting in August 2016, of actual NSA hacking tools by a mysterious person or group called The Shadow Brokers (TSB or SB).

There has been a lot of speculation about who could be behind this and how he, she or they got access to these sensitive files. One option is an NSA insider, either on his own, in cooperation with crypto-anarchists, or as a mole directed by a hostile intelligence agency.

Another suggestion was that an NSA hacker mistakenly uploaded his whole toolkit to a server outside the NSA's secure networks (a so-called "staging server" or "redirector", used to mask the true origin of an attack) and that someone was able to grab the files from there. This option was favored by Snowden, for example.


The latter theory was falsified when, on April 14, 2017, the Shadow Brokers published not only an archive containing a series of Windows exploits, but also several documents and top secret presentation slides about the NSA's infiltration of the banking network SWIFT - things unlikely to be found on a staging server, which makes it most likely that the source behind the Shadow Brokers is an insider. Perhaps one or two of these hacking tools had been on a staging server for a specific mission, but certainly not all of those published by the Shadow Brokers.

On July 28, the website CyberScoop reported that as part of their investigation into the Shadow Brokers leaks, US government counterintelligence investigators contacted former NSA employees in an effort to identify a possible disgruntled insider.

(Just a few days ago, the Shadow Brokers released a manual for the hacking framework UNITEDRAKE, strangely enough without date and classification markings, but again something that one wouldn't find on an outside staging server.)

Same source?

With the documents published by the Shadow Brokers apparently having been stolen by an insider at the NSA, the obvious question is: could the Shadow Brokers be identical to the Second Source? (see update)

One interesting fact is that the last revelation that could be attributed to the second source occurred on February 23, 2016, and that in August of that year the Shadow Brokers started their release of hacking files. This could mean that the second source decided to publish his documents in a more distinct and noticeable way under the guise of the Shadow Brokers.

But there's probably also a much more direct connection: the batch of documents published along with Der Spiegel's main piece from December 29, 2013 includes a presentation about the TAO unit at the NSA's Cryptologic Center in San Antonio, Texas, known as NSA/CSS Texas (NSAT):

TAO Texas presentation, published by Der Spiegel in December 2013
(click for the full presentation)

And surprisingly, the series of three slides that were released by the Shadow Brokers on April 14 were also from NSA/CSS Texas. They show three seals: those of the NSA and the CSS in the upper left corner and that of the Texas Cryptologic Center in the upper right:

TAO Texas slide, published by the Shadow Brokers in April 2017
(click for the full presentation)


It's quite remarkable that among the hundreds of NSA documents that have been published so far, there are only these two sets from NSA/CSS Texas. This facility is responsible for operations in Latin America, the Caribbean, and along the Atlantic littoral of Africa in support of the US Southern and Central Commands.

Update: The three Shadow Brokers slides from NSA/CSS Texas show operations against EastNets and Business Computer Group (BCG), which are both Service Bureaus for the banking network SWIFT. EastNets has offices in Belgium, Jordan, Egypt and UAE and was targeted under the codename JEEPFLEA_MARKET, while BCG serves Panama and Venezuela and was targeted under JEEPFLEA_POWDER.

Besides the one in San Antonio, Texas, NSA has three other regional Cryptologic Centers in the US: in Augusta, Georgia, in Honolulu, Hawaii and in Denver, Colorado. These four locations were established in 1995 as Regional Security Operations Centers (RSOC) in order to disperse operational facilities from the Washington DC area, providing redundancy in the event of an emergency.

So far, no documents from any of these regional centers have been published, except for the two from NSA/CSS Texas. This could be a strong indication that they came from the same source - and it seems plausible to assume that that source is someone who actually worked at that NSA location in San Antonio.


This person may only have stolen files that were available at his own workplace; it should be realized that not every leaker necessarily has access as broad as Snowden had (and gained) in his job as a systems administrator.

Snowden, on the other hand, may only have downloaded files from an intranet for the NSA as a whole (assuming that would contain the most interesting files), leaving the local network of his Hawaii office untouched - which would explain why we never saw any documents marked NSA/CSS Hawaii (another reason could be that such documents would have made it easier to identify him).

Given the many hacking files, it's tempting to assume that the second source/Shadow Brokers was an NSA hacker at the Texas TAO unit. It's not clear though whether someone in such a position would also have had the access to the intelligence reports and traditional tasking lists which were published by Wikileaks. It's also possible that those documents came from a different source.


One final thing that the revelations from the second source and the Shadow Brokers seem to have in common is the motivation: none of their documents reveal serious abuses or illegal methods; they only compromise methods and operations, and discredit US intelligence.

Most of these documents weren't vetted by professional journalists either: although the first files were published by Der Spiegel and some other German media, later files were made public by the uncritical website Wikileaks, while the Shadow Brokers' postings appear without any intermediary on sites like Pastebin, Medium and Steemit.

(In March 2017, Wikileaks started the "Vault 7" series in which they publish secret hacking tools from the CIA. These files have dates between November 2003 and March 2016 and are therefore more recent than those from the Shadow Brokers, whose newest files are dated October 18, 2013 - some 5 months after Snowden left the NSA and around the same time that Der Spiegel published the first document from the second source.)

Update #1:

On the weblog Emptywheel.net there are some additional thoughts about this issue: the author is, for various reasons, skeptical that the Shadow Brokers is a disgruntled NSA employee or contractor, and therefore also that he could be identical to the Second Source. As an alternative, Emptywheel suggests that Jacob Appelbaum and the Shadow Brokers may have a mutually shared source.

I can agree with that, as I may not have made sufficiently clear that the Second Source is the person who was able to actually steal documents from inside the NSA, while the Shadow Brokers is a group or a single person responsible for publishing the files, just like Der Spiegel and Wikileaks did for most of the documents attributed to the Second Source.

Of course it would be possible that the Second Source eventually started to publish his documents himself under the cover name Shadow Brokers, but as noted by Emptywheel, there are several indications that make this less likely.

A slightly different option is that the Second Source provided his documents to Jacob Appelbaum and that he had them published by Der Spiegel and Wikileaks, and that later on, either Appelbaum himself acted as the Shadow Brokers, or gave the files to someone else operating under that guise.

Update #2:

In November 2013, the New York Times published a slide with a pie chart showing the sources of 103 collection accesses at the NSA's station in San Antonio, Texas. It's not clear though whether only this individual slide/chart is about NSA Texas, or the presentation as a whole.

Update #3:

An updated overview of the Shadow Brokers story was published by the New York Times on November 12, 2017. It reported that investigators were worried that one or more leakers might still be inside the NSA, and that the small number of specialists who have worked both at TAO and at the CIA came in for particular attention, out of concern that a single leaker might be responsible for both the Shadow Brokers releases and the files published by Wikileaks as part of their Vault 7 and Vault 8 series (although the CIA files are more recent).

Links and sources
- The New York Times: Security Breach and Spilled Secrets Have Shaken the N.S.A. to Its Core (2017)
- Emptywheel.net: UNITEDRAKE and hacking under FISA Orders (2017)
- Emptywheel.net: Shadow Brokers' Persistence: Where TSB has signed, message, hosted, and collected
- Schneier.com: The US Intelligence Community has a Third Leaker (2014)

by P/K (noreply@blogger.com)

January 16, 2020

Data Knightmare (Italian podcast)

DK 4x18 - Certificazioni GDPR

What is a certification? What is it for? Which certifications exist for the GDPR? Spoiler: ISO 27001 does NOT certify GDPR compliance.

by Walter Vannini

January 15, 2020

n-gate.com. we can't both be right.

webshit weekly

An annotated digest of the top "Hacker" "News" posts for the second week of January, 2020.

Broot – A new way to see and navigate directory trees
January 08, 2020 (comments)
The Rust Evangelism Strike Force, using a mere five thousand lines of code (not counting the twenty remotely-imported libraries), implements a version of ls(1) that is not good at listing files. Hackernews has been looking for a file explorer with a complicated user interface for a very long time, and is extremely pleased. The author shows up, and to demonstrate their gratitude, Hackernews bickers over Github etiquette, then lists all the other programs that have had the same functionality since the Reagan administration.

BeOS: The Alternate Universe's Mac OS X
January 09, 2020 (comments)
An Internet tries to convince us that a dead operating system was good by comparing it to a bad one. Hackernews experiences a devastating nostalgia storm, which gives way to dozens of minor skirmishes about who was a better middle manager or which 1980s computer processor could do a specific trick. Later Hackernews assert that the industry has progressed far since then, and hold up as evidence a pile of reimplementations of the 1990s software, but with webshit.

VVVVVV’s source code is now public, 10 year anniversary jam happening now
January 10, 2020 (comments)
A computer game gets naked. Hackernews tries to understand how the computer game attained popularity despite not adhering to received programming wisdom. Later, Hackernews mourns the death of Adobe Flash, and investigates the sixteen thousand half-assed replacements that have popped up over the years.

Goodbye, Clean Code
January 11, 2020 (comments)
A webshit seizes the means of abstraction. Having just read an article on the topic of inappropriate rigor in software design, Hackernews sets about generating an inappropriately rigorous metric for deciding when rigor may be inappropriate. In the rest of the discussions, Hackernews debates whether it's appropriate to spend company money duplicating a colleague's work because you didn't like the shape of the text.

Deploy your side-projects at scale for basically nothing – Google Cloud Run
January 12, 2020 (comments)
A webshit enthusiastically emits fifteen hundred words describing Google's reimplementation of CGI, which works absolutely wonderfully as long as nobody uses any of your websites. Hackernews is excited about the possibilities of CGI++, but concerned because (like all cloud services) there's no way to cap expenditures and nobody at the hosting company gives a shit about you or your problems. The cloud apologists arrive to assure everyone that neither of these are problems and you should just relax and move your products into this service ASAP. It's 2020, after all.

iOS 13 app tracking alert has dramatically cut location data flow to ad industry
January 13, 2020 (comments)
Apple successfully reroutes the digital oil pipelines to their own refineries. Hackernews declares that Tim Cook is the chosen one, foretold to defend us from their day jobs. Some light Android astroturfing occurs in response, but is blown away in the relentless gale of hot air about how much Apple cares about us on an intimate, familial level.

Patch Critical Cryptographic Vulnerability in Microsoft Windows [pdf]
January 14, 2020 (comments)
The National Security Agency, for the first time in recorded history, contributes as an Agency to the Security of the Nation. Hackernews is always delighted at the opportunity to be seen in public trying to understand cryptography, even if the lispy webshit they're using isn't really up to the task. Other Hackernews realize, with creeping horror, that Let's Encrypt did not, in fact, protect them against advanced persistent threats, despite the repeated whining from peripheral security-theater hucksters. When will these assholes learn that the only way to respond to a hostile government is to overthrow it? Stay tuned to find out.

by http://n-gate.com/hackernews/2020/01/14/0/

Zero Days

An introduction to fake news and disinformation

The topic of fake news is extremely current, and it poses numerous problems for modern society.

In this podcast we introduce the subject by analyzing what fake news is, how it spreads, how it fits seamlessly into the social network ecosystem, and what the possible remedies might be.

Attention to the user's critical thinking proves especially central, even though it is very difficult to foster in a context like that of social networks and today's "instant" digital communication.

By the end of this podcast the listener will have learned some basic notions that will then allow for more in-depth study, including of specific topics.

by Giovanni Ziccardi

January 10, 2020

Institute of Network Cultures

My body, my traitor: essay in 10TAL

Last year Swedish magazine 10TAL published an essay of mine about data mining, the body, and self-knowledge. Now, the English translation has been published too! Read the opening below or click through to the whole text: My body, my traitor


We’ve come a long way, baby. From »On the internet nobody knows you’re a dog«, to »On the internet everybody knows I’m a top dog«, to »On the internet we’re all Pavlov’s dogs«. Or: from the homepage, to the social media identity, to the algorithmic profile. From the nerd, to the networked self, to the passive data goldmine (via the influencer).

This evolution can be read as a story of increasing corporality, as counterintuitive as that may seem. Usually, the story is told as if the world and its inhabitants are on their way to shedding all bodily weight, with the end-point on the horizon being a purely computerized humankind, all Mind and no Matter. But the inextricable entanglement of »online« and »offline«, »virtual« and »real«, primarily means that technology is all the time affecting the body (and thus, via the body, the soul). Like Pavlov’s dog, the post-digital condition is no »cloud« in which everything evaporates (and, as we know, what is called the cloud is a very material infrastructure that is using up as much energy as a small country, relying on cables that land on contested shores, demanding ever more precious metals to be mined from unstable regions). Rather, our bodies and the data that can be mined from them function as the pathways to understanding, predicting and thus controlling or manipulating the world, which in the same gesture means understanding, predicting and thus controlling or manipulating the body, the very body that was mined in the first place.

In a catch-22 situation, you’re always made an accomplice to your own submission.

Continue reading over at 10TAL: My body, my traitor

by Miriam Rasch

January 09, 2020

Institute of Network Cultures

Column – Take root among the stars: If Octavia Butler wrote design fiction

Authors: Gabriele Ferri, Inte Gloerich

“The Butler Timeline (BT) is a parallel universe where renowned speculative fiction author Octavia E. Butler engaged in a critical dialogue with researchers in human-computer interaction, shaping the genre of design fiction differently from how it unfolded in our timeline. Here we present a meta-speculation, imagining what could have been different if Butler, a prominent African American writer and intellectual, played a key role in establishing speculative design research. We do not want to create a temporal paradox but, if we had a transdimensional portal, we could simply observe how speculative research came to be in the BT. Hopefully, this could suggest another way of doing design fiction in our own reality, with a different ideology and purpose. That is why we volunteer for this interdimensional travel.”

Read the whole column here.

Interactions XXVII.1 (January – February 2020)

by Inte Gloerich

January 08, 2020

n-gate.com. we can't both be right.

webshit weekly

An annotated digest of the top "Hacker" "News" posts for the first week of January, 2020.

Software Disenchantment (2018)
January 01, 2020 (comments)
A webshit gets blacklisted from Google I/O. Hackernews spends several hours bikeshedding a car analogy, with an eye toward making excuses for the idiocies outlined in the article. Another Hackernews decides to paste year-old Reddit comments. The rest of the comments follow a general pattern: Hackernews disagrees with the premise, regards it as unrealistic, but whatever software pissed a given Hackernews off recently is the exception and should be eradicated.

Which Emoji Scissors Close?
January 02, 2020 (comments)
An Internet is either bored or salaried. Hackernews posts links to similar obsessions about cellphone cartoons.

EA is permanently banning Linux players on Battlefield V
January 03, 2020 (comments)
Some idiots give money to assholes and get what's coming. Hackernews discusses the core of the problem (people are assholes) and the only possible response to the situation (deal with it or else fuck off). Many Hackernews list the computer games they have played, and reminisce about solutions that worked in the past, but each of them in turn is told that none of those solutions apply any more, or are not worth implementing, and the market dictates that "deal with it or else fuck off" is really best for everyone.

Building a BitTorrent client from the ground up in Go
January 04, 2020 (comments)
A webshit annotates a barely-functioning implementation of a popular piracy tool via Microsoft Paint. Hackernews is unsatisfied with merely getting cartoons about the important parts of the tool; they demand access to the software bureaucracy as well. A discussion breaks out about whether tools like this are doomed to be replaced by webshit reimplementations. The rest of the commenters want to learn Python.

Ask HN: Are books worth it?
January 04, 2020 (comments)
A Hackernews tries to figure out if there's anything important locked away behind arcane NoSQL implementations. Hackernews proceeds to teach one another to read. Not the mechanics of translating linguistic symbols to ideas and information, but the actual practice of identifying, selecting, and reading physical objects. Most of the comments are not contained in threads; instead, Hackernews decided to reply to the original question with over two hundred top-level replies, almost none of which garner a single response. Digging to the bottom of that barrel surfaces a fascinating collection of troglodytes who consider books to be actually dangerous. Finally there's the idiot who complains that phone apps which present ten-minute abstracts of entire books "condensed too much."

Top Paying Tech Companies by SWE Level
January 05, 2020 (comments)
Some webshits dish about scratch. Venture-backed money incinerators vastly overpay junior engineers, while advertising-cum-surveillance networks overpay senior engineers. Pinterest appears in the resulting charts, demonstrating that spamming Google image search results remains a steady source of salaried income for assholes. Hackernews argues about whether any of the numbers are real and how to get someone to give them that much money. No technology is discussed.

Ask HN: I've been slacking off at Google for 6 years. How can I stop this?
January 05, 2020 (comments)
A Google wants to get a real job. Hackernews recommends against this, instead advising the Google to continue grifting corporate money and fornicate more. One Hackernews strongly supports this method, as actual work is apparently to blame for alcoholism and impending suicide. A gratifying number of other Hackernews reach out to offer moral support to this individual. The next comment thread is full of hints and tips at how to snow disinterested managers into thinking you're worth promoting, despite being a net drag on productivity. Each of these threads are nearly three hundred comments long, chock full of Hackernews asking and answering how to be a total piece of shit and still skate through personnel reviews. The rest of the comments are Hackernews struggling with the concept of "meaningful work," and whether that idea is something even remotely possible in real life.

Products I Wish Existed
January 06, 2020 (comments)
Some rich asshole posts a TODO list for someone else to use to make the asshole richer. Almost all of the things on the list exist; a better title would have been "Markets I Wish I Had Invested In." Hackernews selects entries from the list and agrees with them. Along the way, Hackernews invents Lockheed, Friendster, Scratch, Digg, call centers, "premium" tech support, and Mrs. Grundy.

The first chosen-prefix collision for SHA-1
January 07, 2020 (comments)
Cryptographers continue to cryptographate. Math is here, and so Hackernews spends several hours incorrecting one another about the contents of the article, even though it was written in plain English at a high school level (presumably the authors have encountered the internet before). Hackernews tries to muster up some optimism regarding their industry, but then the analogies show up, and several hours are lost trying to navigate back out of that quicksand. The rest of the comments are Hackernews asking each other if the two or three programs they actually use are subject to the described attack. Hackernews thinks they are not. They're not sure. But they're pretty sure.

by http://n-gate.com/hackernews/2020/01/07/0/