nikhil.io

The Pithy Wisdom of Stephen Crowley

Stephen Crowley is a product designer who maintains @ShitUserStory, my favorite new Twitter account¹ (via Deepu). He also maintains a Medium blog with gems like these:

Lovely stuff.

  1. Given the amount of rage I’ve had with awful product design and really, really shitty websites of late. The madness doesn’t stop with the web. On $250 Sony WH-1000XM4 headphones, which are comfortable and have lovely sound and the best active noise-canceling I’ve ever experienced, opting to “Disable Voice Guidance” still means that the nice lady inside your headphones will tell you when you dis/connect your Bluetooth device. You gotta toggle a feature in the app to prevent iTunes from launching every time they pair with your Mac (the Sony folk think this “feature” is “unfortunate,” so there’s that at least). Your headphones can just choose to turn off the moment you turn them on unless you update the firmware. Would you like to share your location? Do you want the Sony app to send you notifications? We’ll need the last four digits of your SSN so we can create a tailor-made listening profile for you. Is that OK?

The Lighthouse

The Lighthouse (2019)

IMDb

Rating: A

"Robert Pattinson said to me before agreeing to this, ‘I don’t want to make a movie about a magical lighthouse. I want to make a movie about a fucking crazy person.’”

Jess Joho, “What the hell did ‘The Lighthouse’ even mean?”, Mashable

Saw with LD. Noir, Jung, myth and mythology, Proteus, Prometheus, masculinity, sexuality, very large phallus, isolation, identity, lobster dinners, psychosis, mermaids, flatulence, alcoholism, omens, portents, songs, odes, the-father-is-the-son-is-the-father, Rime of the Ancient Mariner.

It’s “Every Frame a Painting,” and it somehow manages to be quite funny at times. Oh, and this (emphasis mine):

Underneath the jargon and flatulence, the film is mostly concerned with identity.

Vinnie Mancuso, “‘The Lighthouse’ Ending Feeds Myth and Symbolism to the Birds”, Collider

Bravo!

On ‘Deliberate’ Genocide in the Americas

by CommodoreCoCo

Responding to this chilling comment:

You are failing to understand genocide itself. INTENT, is the word, DELIBERATION. Deliberation to destroy an ethnic group. There was NEVER a deliberate attempt to destroy native culture in the Americas. In fact, you have laws since the 1512 protecting their rights and equalising them to Iberian Crown subjects, “Las Leyes de Burgos”.

Because, you see, unintentional genocide is A-OK.

I see I’ve been summoned. Your comments in this thread make it clear that nothing will change your position. It’s a difficult position to combat, because it’s in such defiance of literally anything written on the topic in at least the last 50 years. You are not operating off the same foundations of evidence that others are, and for that reason I suspect they, like me, are not terribly interested in arguing. Because it’s unlikely your drivel will be removed, I’m posting some quotes and links for those who see this thread later and think you might have even begun to approach a point supported by any specialist on the topic. I do not intend these to be comprehensive; there are myriad examples of “deliberate attempts to destroy native culture in the Americas” in, well, literally any single book or article you can pick up about the era. Rather, because you’ve instead insisted there never was any such thing, I’ve provided some obvious examples.


A primary goal of the Spanish colonial regime was to completely extirpate indigenous ways of life. While this was nominally about conversion to Catholicism, those in charge made it quite explicit that “conversion” not only should be but needed to be a violent process. Everything potentially conceivable as an indigenous practice, be it burial rituals, ways to build houses, or farming technologies, was targeted. To quote historian Peter Gose:

only by rebuilding Indian life from the ground up, educating, and preventing (with force if necessary) the return to idolatry could the missionary arrest these hereditary inclinations and modify them over time.

Francisco de Toledo, Viceroy of Peru, made clear in a 1570 decree that failure to comply with Catholicism was an offense punishable by death and within secular jurisdiction:

And should it occur that an infidel dogmatizer be found who disrupts the preaching of the gospel and manages to pervert the newly converted, in this case secular judges can proceed against such infidel dogmatizers, punishing them with death or other punishments that seem appropriate to them, since it is declared by congresses of theologians and jurists that His Majesty has convened in the Kingdoms of Spain that not only is this just cause for condemning such people to death, but even for waging war against a whole kingdom or province with all the death and damage to property that results

The same Toledo decreed in 1580 that Catholic priests and secular judges and magistrates should work together to destroy indigenous burial sites:

I order and command that each magistrate ensure that in his district all the tower tombs be knocked down, and that a large pit be dug into which all of the bones of those who died as pagans be mixed together, and that special care be taken henceforth to gather the intelligence necessary to discover whether any of the baptized are buried outside of the church, with the priest and the judge helping each other in such an important matter

Not only was the destruction of native culture a top-down decree, resistance was explicitly a death sentence.


The contemporary diversity of Latin America is not the result of natural “intermixing,” but the failure of the Spanish to assert themselves and the continuous resistance of the indigenous population. As early as 1588, we see letters from local priests airing grievances about the failure of the reduccion towns they were supposed to relocate native families to:

‘the corregidores are obliged, and the governors, to reduce the towns and order them reduced, and to build churches, take care to find out if the people come diligently for religious instruction and mass, to make them come and help the priest, and punish the careless, lazy, and bad Indians in the works of Christianity, as the ordinances of don Francisco de Toledo require, [but] they do not comply. Rather, many of the towns have yet to be reduced, and many churches are yet to be built, and a large part of the Indians are fled to many places where they neither see a priest nor receive religious instruction.

Reduccion was not a voluntary process, nor was it a question of simply “moving away.” Not only did it involve the destruction of native religious sites, it frequently involved the destruction of entire towns to repurpose building material and ensure people could not return. In fact, where we do see more voluntary participation in Spanish colonial structures, usually because of the political legibility and opportunities it provided, the resulting syncretism becomes an ever greater source of anxiety for the Spanish. Indigenous elites could selectively participate in Catholicism and game the system to their benefit, not something the state wanted to admit could happen.

These quotes come from Gose’s chapter on reducciones uploaded here.

I will also provide this section from the conclusion of Nicholas Robins’ book Mercury, Mining, and Empire; the entirety is uploaded here. The quoted chunk below is a summary of the various historical events presented in that chapter.

The white legend held much historiographical sway throughout the nineteenth and much of the twentieth centuries, and in no small part reflected a selective focus on legal structures rather than their application, subsumed in a denigratory view of native peoples, their cultures, and their heritage. As later twentieth-century historians began to examine the actual operation of the colony, the black legend again gained ascendance. As Benjamin Keen wrote, the black legend is “no legend at all.”

Twentieth-century concepts of genocide have superseded this debate, and the genocidal nature of the conquest is, ironically, evident in the very Spanish laws that the advocates of the white legend used in their efforts to justify their position. Such policies in Latin America had a defining influence on Rafael Lemkin, the scholar who first developed the term genocide in Axis Rule in Occupied Europe. As developed by Lemkin, “Genocide has two phases: one, destruction of the national pattern of the oppressed group; the other, the imposition of the national pattern of the oppressor,” which often included the establishment of settler colonies. Because of the intimate links between culture and national identity, Lemkin equated intentional cultural destruction with genocide. It was in no small part a result of his tireless efforts that in 1948 the United Nations adopted the definition of genocide which, despite its shortcomings, serves today as international law. The fact that genocide is a modern concept and that colonists operated within the “spirit of the times” in no way lessens the genocidal nature of their actions. It was, in fact, historical genocides, including those in Latin America, that informed Lemkin’s thinking and gave rise to the term.

Dehumanization of the victim is the handmaiden of genocide, and that which occurred in Spanish America is no exception. Although there were those who recognized the humanity of the natives and sought to defend them, they were in the end a small minority. The image of the Indian as a lazy, thieving, ignorant, prevaricating drunkard who only responded to force was, perversely, a step up from the ranks of nonhumans in which they were initially cast. The official recognition that the Indians were in fact human had little effect in their daily lives, as they were still treated like animals and viewed as natural servants by non-Indians. It is remarkable that the white legend could ever emerge from this genocidogenic milieu. With the path to genocide thus opened by the machete of dehumanization, Spanish policies to culturally destroy and otherwise subject the Amerindians as a people were multifaceted, consistent, and enduring. Those developed and implemented by Viceroy Francisco de Toledo in Peru in the 1570s have elevated him to the status of genocidier extraordinaire.

Once an Indian group had refused to submit to the Spanish crown, they could be legally enslaved, and calls for submission were usually made in a language the Indians did not understand and were often out of earshot. In some cases, the goal was the outright physical extermination or enslavement of specific ethnic groups whom the authorities could not control, such as the Chiriguano and Araucanian Indians. Another benefit from the crown’s perspective was that restive Spaniards and Creoles could be dispatched in such campaigns, thus relieving cities and towns of troublemakers while bringing new lands and labor into the kingdom. Ironically, de Toledo’s campaign to wipe out the Chiriguano contributed to his own ill health. Overall, however, genocidal policies in the Andes and the Americas centered on systematic cultural, religious, and linguistic destruction, forced labor, and forced relocation, much of which affected reproduction and the ability of individuals and communities to sustain themselves.

The forced relocation of Indians from usually spread-out settlements into reducciones, or Spanish-style communities, had among its primary objectives the abolition of indigenous religious and cultural practices and their replacement with those associated with Catholicism. As native lands and the surrounding geographical environment had tremendous spiritual significance, their physical removal also undermined indigenous spiritual relationships. Complementing the natives’ spiritual and cultural control was the physical control, and thus access to labor, offered by the new communities. The concentration of people also inadvertently fostered the spread of disease, giving added impetus to the demographic implosion. Finally, forced relocation was a direct attack on traditional means of sustenance, as many kin groups settled in and utilized the diverse microclimates of the region to provide a variety of foodstuffs and products for the group.

Integrated into this cultural onslaught were extirpation campaigns designed to seek out and destroy all indigenous religious shrines and icons and to either convert or kill native religious leaders. The damage matched the zeal and went to the heart of indigenous spiritual identity. For example, in 1559, an extirpation drive led by Augustinian friars resulted in the destruction of about 5,000 religious icons in the region of Huaylas, Peru, alone. Cultural destruction, or ethnocide, also occurred on a daily basis in Indian villages, where the natives were subject to forced baptism as well as physical and financial participation in a host of Catholic rites. As linchpins in the colonial apparatus, the clergy not only focused on spiritual conformity but also wielded formidable political and economic power in the community. Challenges to their authority were quickly met with the lash, imprisonment, exile, or the confiscation of property.

Miscegenation, often though not always through rape, also had profound personal, cultural, and genetic impacts on indigenous people. Part of the reason was the relative paucity of Spanish women in the colony, while power, opportunity, and impunity also played important roles. Genetic effacement was, in the 1770s, complemented by efforts to illegalize and eliminate native languages. A component in the wider effort to deculturate the indigenes, such policies were implemented with renewed vigor following the Great Rebellion of 1780–1782. Such laws contained provisions making it illegal to communicate with servants in anything but Spanish, and any servant who did not promptly learn the language was to be fired. The fact that there are still Indians in the Andes does not diminish the fact that they were victims of genocide, for few genocides are total.

Lastly, I would direct readers to the following article: Levene, Mark. 1999. “The Chittagong Hill Tracts: A Case Study in the Political Economy of ‘Creeping’ Genocide.” Third World Quarterly 20 (2): 339–69.

Though it talks about events a world away, its discussion of genocide is pertinent here. From the abstract:

The destruction of indigenous, tribal peoples in remote and/or frontier regions of the developing world is often assumed to be the outcome of inexorable, even inevitable forces of progress. People are not so much killed, they become extinct. Terms such as ethnocide, cultural genocide or developmental genocide suggest a distinct form of ‘off the map’ elimination which implicitly discourages comparison with other acknowledged examples of genocide. By concentrating on a little-known case study, that of the Chittagong Hill Tracts (CHT) in Bangladesh, this article argues that this sort of categorisation is misplaced. Not only is the destruction or attempted destruction of fourth world peoples central to the pattern of contemporary genocide but, by examining such specific examples, we can more clearly delineate the phenomenon’s more general wellsprings and processes. The example of the CHT does have its own peculiar features; not least what has been termed here its ‘creeping’ nature. In other respects, however, the efforts of a new nation-state to overcome its structural weaknesses by attempting a forced-pace consolidation and settlement of its one, allegedly, unoccupied resource-rich frontier region closely mirrors other state-building, developmental agendas which have been confronted with communal resistance. The ensuing crisis of state–communal relations, however, cannot be viewed in national isolation. Bangladesh’s drive to develop the CHT has not only been funded by Western finance and aid but is closely linked to its efforts to integrate itself rapidly into a Western dominated and regulated international system. It is in these efforts ‘to realise what is actually unrealisable’ that the relationship between a flawed state power and genocide can be located.

Genocide need not be a state program uniquely articulated to eliminate a people or their culture. Rather, it is often disguised in the name “progress” or “development.” This connects to the Spanish colonial economic system, based on what Robins (above) calls the “ultra-violence” of forced labor in mines.

Aaron Rodgers

The sentiment inside The Orange Sphere of Shit, by this genius (who is treating himself with Ivermectin).

Aaron Rodgers by Ben Garrison

QL made some observations:

  1. This is an incomplete pass.
  2. It’s probably an unsportsmanlike conduct penalty.
  3. It’s at least intentional grounding.
  4. It does no good, only hurts the rest of the team.
  5. The vaxxed player would be wearing a cup (you know, because they’re actually protected).
  6. What makes it “accurate” is that the whole point is to hurt another person.

That’ll show 'em.

Eliminating Distractions with MS-DOS

The Dune screenplay was written in MS-DOS, on a program called “Movie Master”. It has a 40-page limit, which helps the writer, Eric Roth.

Writing is fundamentally about putting your ass in the chair and typing the words. Eliminating distractions (I’ve checked Twitter at least five times while writing this short blog) is a key to success. Nothing eliminates distractions like a stripped down simple program with no internet access. Roth also said the 40-page limit helps him structure his screenplays. “I like it because it makes acts,” he said. “I realize if I hadn’t said it in 40 pages I’m starting to get in trouble.”

Matthew Gault, “The ‘Dune’ Screenplay Was Written in MS-DOS”, Vice

Former Member of “Elite Strike Force” Legal Team Now on Illustrious Client’s No-Go List to Save Him the Embarrassment and Discomfiture of His Association with Her

Forgot to add this to my Collection of Shitkraken.

Two lawyers who currently work for Trump or in the former president’s inner orbit say they want absolutely nothing to do with her and have cautioned others in MAGAland to do the same. One said they’d recently deleted her phone number.

Two other people familiar with the matter said that ever since he left office in January, certain advisers and longtime associates to Trump have kept an informal shortlist of people who they should look out for, including at Trump’s private clubs or offices in Florida, New Jersey, and New York. The point of this roster is to intercept and possibly rebuff attempted outreach, visits, or phone calls from a handful of conservative figures who could bring the ex-president more undesired headaches.

“Sidney is very much on the no-go list,” one of the knowledgeable sources said. “Her problems right now do not need to be the [former] president’s problems.”

Powell’s legal exposure right now is, of course, massive. And ever since she tried to work with Trump to orchestrate a coup last year against Joe Biden, feelings of frustration and bitterness have lingered between Trump and Powell. According to a source with direct knowledge of the matter, since December Powell has privately talked about how disappointed she was in Trump because he didn’t end up appointing her to a “special” role in his White House where she would have probed “election fraud” conspiracy theories during the final days of his term.

“She sounded pretty broken up about it,” this person noted. “I felt sorry for her.”

Meanwhile, the Brave Leader of the Elite Strike Force Team feels bad about his own ban from State Television.

The Art of Node

by maxogden

A fantastic introduction to Node (maybe for someone coming in from Python or Ruby land).

The Art of Node

An introduction to Node.js

This document is intended for readers who know at least a little bit of a couple of things:

  • a scripting language like JavaScript, Ruby, Python, Perl, etc. If you aren’t a programmer yet then it is probably easier to start by reading JavaScript for Cats.
  • git and github. These are the open source collaboration tools that people in the node community use to share modules. You just need to know the basics. Here are three great intro tutorials: 1, 2, 3

Learn node interactively

In addition to reading this guide it’s super important to also bust out your favorite text editor and actually write some node code. I always find that when I just read some code in a book it never really clicks, but learning by writing code is a good way to grasp new programming concepts.

NodeSchool.io

NodeSchool.io is a series of free + open source interactive workshops that teach you the principles of Node.js and beyond.

Learn You The Node.js is the introductory NodeSchool.io workshop. It’s a set of programming problems that introduce you to common node patterns. It comes packaged as a command line program.

learnyounode

You can install it with npm:

# install
npm install learnyounode -g

# start the menu
learnyounode

Understanding node

Node.js is an open source project designed to help you write JavaScript programs that talk to networks, file systems or other I/O (input/output, reading/writing) sources. That’s it! It is just a simple and stable I/O platform that you are encouraged to build modules on top of.

What are some examples of I/O? Here is a diagram of an application that I made with node that shows many I/O sources:

server diagram

If you don’t understand all of the different things in the diagram it is completely okay. The point is to show that a single node process (the hexagon in the middle) can act as the broker between all of the different I/O endpoints (orange and purple represent I/O).

Usually building these kinds of systems is either:

  • difficult to code but yields super fast results (like writing your web servers from scratch in C)
  • easy to code but not very speedy/robust (like when someone tries to upload a 5GB file and your server crashes)

Node’s goal is to strike a balance between these two: relatively easy to understand and use and fast enough for most use cases.

Node isn’t either of the following:

  • A web framework (like Rails or Django, though it can be used to make such things)
  • A programming language (it uses JavaScript but node isn’t its own language)

Instead, node is somewhere in the middle. It is:

  • Designed to be simple and therefore relatively easy to understand and use
  • Useful for I/O based programs that need to be fast and/or handle lots of connections

At a lower level, node can be described as a tool for writing two major types of programs:

  • Network programs using the protocols of the web: HTTP, TCP, UDP, DNS and SSL
  • Programs that read and write data to the filesystem or local processes/memory

What is an “I/O based program”? Here are some common I/O sources:

  • Databases (e.g. MySQL, PostgreSQL, MongoDB, Redis, CouchDB)
  • APIs (e.g. Twitter, Facebook, Apple Push Notifications)
  • HTTP/WebSocket connections (from users of a web app)
  • Files (image resizer, video editor, internet radio)

Node does I/O in a way that is asynchronous which lets it handle lots of different things simultaneously. For example, if you go down to a fast food joint and order a cheeseburger they will immediately take your order and then make you wait around until the cheeseburger is ready. In the meantime they can take other orders and start cooking cheeseburgers for other people. Imagine if you had to wait at the register for your cheeseburger, blocking all other people in line from ordering while they cooked your burger! This is called blocking I/O because all I/O (cooking cheeseburgers) happens one at a time. Node, on the other hand, is non-blocking, which means it can cook many cheeseburgers at once.
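
A minimal sketch of the difference, using the core fs module (the file name here is made up):

var fs = require('fs')

// blocking: the whole program waits here until the file has been read
var order = fs.readFileSync('order.txt')
console.log('blocking read finished')

// non-blocking: node moves on immediately and runs the callback
// whenever the file is ready
fs.readFile('order.txt', function(err, order) {
  console.log('non-blocking read finished')
})

console.log('still taking orders while the file is being read')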

Core modules

Firstly I would recommend that you get node installed on your computer. The easiest way is to visit nodejs.org and click Install.

Node has a small core group of modules (commonly referred to as ‘node core’) that are presented as the public API that you are intended to write programs with. For working with file systems there is the fs module and for networks there are modules like net (TCP), http, dgram (UDP).

In addition to fs and network modules there are a number of other base modules in node core. There is a module for asynchronously resolving DNS queries called dns, a module for getting OS specific information like the tmpdir location called os, a module for allocating binary chunks of memory called buffer, some modules for parsing urls and paths (url, querystring, path), etc. Most if not all of the modules in node core are there to support node’s main use case: writing fast programs that talk to file systems or networks.
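
As a quick sketch, here are a few of those core modules in action (all of these calls are part of node core; the path and url are made up):

var os = require('os')
var path = require('path')
var url = require('url')

console.log(os.tmpdir()) // the OS specific temp folder location
console.log(path.join('photos', 'cats.jpg')) // builds up a filesystem path
console.log(url.parse('http://nodejs.org/api/')) // splits a url into its parts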

Node handles I/O with: callbacks, events, streams and modules. If you learn how these four things work then you will be able to go into any module in node core and have a basic understanding about how to interface with it.

Callbacks

This is the most important topic to understand if you want to understand how to use node. Nearly everything in node uses callbacks. They weren’t invented by node, they are just part of the JavaScript language.

Callbacks are functions that are executed asynchronously, or at a later time. Instead of the code reading top to bottom procedurally, async programs may execute different functions at different times based on the order and speed that earlier functions like http requests or file system reads happen.

The difference can be confusing since determining if a function is asynchronous or not depends a lot on context. Here is a simple synchronous example, meaning you can read the code top to bottom just like a book:

var myNumber = 1
function addOne() { myNumber++ } // define the function
addOne() // run the function
console.log(myNumber) // logs out 2

The code here defines a function and then on the next line calls that function, without waiting for anything. When the function is called it immediately adds 1 to the number, so we can expect that after we call the function the number should be 2. This is the expectation of synchronous code - it sequentially runs top to bottom.

Node, however, uses mostly asynchronous code. Let’s use node to read our number from a file called number.txt:

var fs = require('fs') // require is a special function provided by node
var myNumber = undefined // we don't know what the number is yet since it is stored in a file

function addOne() {
  fs.readFile('number.txt', function doneReading(err, fileContents) {
    myNumber = parseInt(fileContents)
    myNumber++
  })
}

addOne()

console.log(myNumber) // logs out undefined -- this line gets run before readFile is done

Why do we get undefined when we log out the number this time? In this code we use the fs.readFile method, which happens to be an asynchronous method. Usually things that have to talk to hard drives or networks will be asynchronous. If they just have to access things in memory or do some work on the CPU they will be synchronous. The reason for this is that I/O is reallyyy reallyyy sloowwww. A ballpark figure would be that talking to a hard drive is about 100,000 times slower than talking to memory (e.g. RAM).

When we run this program all of the functions are immediately defined, but they don’t all execute immediately. This is a fundamental thing to understand about async programming. When addOne is called it kicks off a readFile and then moves on to the next thing that is ready to execute. If there is nothing to execute node will either wait for pending fs/network operations to finish or it will stop running and exit to the command line.

When readFile is done reading the file (this may take anywhere from milliseconds to seconds to minutes depending on how fast the hard drive is) it will run the doneReading function and give it an error (if there was an error) and the file contents.

The reason we got undefined above is that nowhere in our code exists logic that tells the console.log statement to wait until the readFile statement finishes before it prints out the number.

If you have some code that you want to be able to execute over and over again, or at a later time, the first step is to put that code inside a function. Then you can call the function whenever you want to run your code. It helps to give your functions descriptive names.

Callbacks are just functions that get executed at some later time. The key to understanding callbacks is to realize that they are used when you don’t know when some async operation will complete, but you do know where the operation will complete — the last line of the async function! The top-to-bottom order that you declare callbacks does not necessarily matter, only the logical/hierarchical nesting of them. First you split your code up into functions, and then use callbacks to declare if one function depends on another function finishing.

The fs.readFile method is provided by node, is asynchronous, and happens to take a long time to finish. Consider what it does: it has to go to the operating system, which in turn has to go to the file system, which lives on a hard drive that may or may not be spinning at thousands of revolutions per minute. Then it has to use a magnetic head to read data and send it back up through the layers back into your javascript program. You give readFile a function (known as a callback) that it will call after it has retrieved the data from the file system. It puts the data it retrieved into a javascript variable and calls your function (callback) with that variable. In this case the variable is called fileContents because it contains the contents of the file that was read.

Think of the restaurant example at the beginning of this tutorial. At many restaurants you get a number to put on your table while you wait for your food. These are a lot like callbacks. They tell the server what to do after your cheeseburger is done.

Let’s put our console.log statement into a function and pass it in as a callback:

var fs = require('fs')
var myNumber = undefined

function addOne(callback) {
  fs.readFile('number.txt', function doneReading(err, fileContents) {
    myNumber = parseInt(fileContents)
    myNumber++
    callback()
  })
}

function logMyNumber() {
  console.log(myNumber)
}

addOne(logMyNumber)

Now the logMyNumber function can get passed in as an argument that will become the callback variable inside the addOne function. After readFile is done the callback variable will be invoked (callback()). Only functions can be invoked, so if you pass in anything other than a function it will cause an error.

When a function gets invoked in javascript the code inside that function will immediately get executed. In this case our log statement will execute since callback is actually logMyNumber. Remember, just because you define a function it doesn’t mean it will execute. You have to invoke a function for that to happen.

To break down this example even more, here is a timeline of events that happen when we run this program:

  • 1: The code is parsed, which means if there are any syntax errors they would make the program break. During this initial phase, fs and myNumber are declared as variables while addOne and logMyNumber are declared as functions. Note that these are just declarations. Neither function has been called nor invoked yet.
  • 2: When the last line of our program gets executed addOne is invoked with the logMyNumber function passed as its callback argument. Invoking addOne will first run the asynchronous fs.readFile function. This part of the program takes a while to finish.
  • 3: With nothing to do, node idles for a bit as it waits for readFile to finish. If there was anything else to do during this time, node would be available for work.
  • 4: As soon as readFile finishes it executes its callback, doneReading, which parses fileContents for an integer called myNumber, increments myNumber and then immediately invokes the function that addOne passed in (its callback), logMyNumber.

Perhaps the most confusing part of programming with callbacks is how functions are just objects that can be stored in variables and passed around with different names. Giving simple and descriptive names to your variables is important in making your code readable by others. Generally speaking in node programs when you see a variable like callback or cb you can assume it is a function.

You may have heard the terms ‘evented programming’ or ‘event loop’. They refer to the way that readFile is implemented. Node first dispatches the readFile operation and then waits for readFile to send it an event that it has completed. While it is waiting node can go check on other things. Inside node there is a list of things that are dispatched but haven’t reported back yet, so node loops over the list again and again checking to see if they are finished. After they finish they get ‘processed’, e.g. any callbacks that depended on them finishing will get invoked.

Here is a pseudocode version of the above example:

function addOne(thenRunThisFunction) {
  waitAMinuteAsync(function waitedAMinute() {
    thenRunThisFunction()
  })
}

addOne(function thisGetsRunAfterAddOneFinishes() {})

Imagine you had 3 async functions a, b and c. Each one takes 1 minute to run and after it finishes it calls a callback (that gets passed in as the first argument). If you wanted to tell node ‘start running a, then run b after a finishes, and then run c after b finishes’ it would look like this:

a(function() {
  b(function() {
    c()
  })
})

When this code gets executed, a will immediately start running, then a minute later it will finish and call b, then a minute later it will finish and call c and finally 3 minutes later node will stop running since there would be nothing more to do. There are definitely more elegant ways to write the above example, but the point is that if you have code that has to wait for some other async code to finish then you express that dependency by putting your code in functions that get passed around as callbacks.

The design of node requires you to think non-linearly. Consider this list of operations:

read a file
process that file

If you were to turn this into pseudocode you would end up with this:

var file = readFile()
processFile(file)

This kind of linear (step-by-step, in order) code isn’t the way that node works. If this code were to get executed then readFile and processFile would both get executed at the same exact time. This doesn’t make sense since readFile will take a while to complete. Instead you need to express that processFile depends on readFile finishing. This is exactly what callbacks are for! And because of the way that JavaScript works you can write this dependency many different ways:

var fs = require('fs')
fs.readFile('movie.mp4', finishedReading)

function finishedReading(error, movieData) {
  if (error) return console.error(error)
  // do something with the movieData
}

But you could also structure your code like this and it would still work:

var fs = require('fs')

function finishedReading(error, movieData) {
  if (error) return console.error(error)
  // do something with the movieData
}

fs.readFile('movie.mp4', finishedReading)

Or even like this:

var fs = require('fs')

fs.readFile('movie.mp4', function finishedReading(error, movieData) {
  if (error) return console.error(error)
  // do something with the movieData
})

Events

In node if you require the events module you can use the so-called ‘event emitter’ that node itself uses for all of its APIs that emit things.

Events are a common pattern in programming, known more widely as the ‘observer pattern’ or ‘pub/sub’ (publish/subscribe). Whereas callbacks are a one-to-one relationship between the thing waiting for the callback and the thing calling the callback, events are the same exact pattern except with a many-to-many API.

The easiest way to think about events is that they let you subscribe to things. You can say ‘when X do Y’, whereas with plain callbacks it is ‘do X then Y’.
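
Here is a minimal ‘when X do Y’ sketch using the events module (the emitter and the event name are invented for illustration):

var EventEmitter = require('events').EventEmitter

var radio = new EventEmitter()

// subscribe: whenever a 'song' event happens, run this function
radio.on('song', function(name) {
  console.log('now playing: ' + name)
})

// publish: everyone subscribed to 'song' gets called with the arguments
radio.emit('song', 'Fog Horn Blues')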

Here are a few common use cases for using events instead of plain callbacks:

  • Chat room where you want to broadcast messages to many listeners
  • Game server that needs to know when new players connect, disconnect, move, shoot and jump
  • Game engine where you want to let game developers subscribe to events like .on('jump', function() {})
  • A low level web server that wants to expose an API to easily hook into events that happen like .on('incomingRequest') or .on('serverError')

If we were trying to write a module that connects to a chat server using only callbacks it would look like this:

var chatClient = require('my-chat-client')

function onConnect() {
  // have the UI show we are connected
}

function onConnectionError(error) {
  // show error to the user
}

function onDisconnect() {
 // tell user that they have been disconnected
}

function onMessage(message) {
 // show the chat room message in the UI
}

chatClient.connect(
  'http://mychatserver.com',
  onConnect,
  onConnectionError,
  onDisconnect,
  onMessage
)

As you can see this is really cumbersome because of all of the functions that you have to pass in a specific order to the .connect function. Writing this with events would look like this:

var chatClient = require('my-chat-client').connect()

chatClient.on('connect', function() {
  // have the UI show we are connected
})

chatClient.on('connectionError', function() {
  // show error to the user
})

chatClient.on('disconnect', function() {
  // tell user that they have been disconnected
})

chatClient.on('message', function() {
  // show the chat room message in the UI
})

This approach is similar to the pure-callback approach but introduces the .on method, which subscribes a callback to an event. This means you can choose which events you want to subscribe to from the chatClient. You can also subscribe to the same event multiple times with different callbacks:

var chatClient = require('my-chat-client').connect()
chatClient.on('message', logMessage)
chatClient.on('message', storeMessage)

function logMessage(message) {
  console.log(message)
}

function storeMessage(message) {
  myDatabase.save(message)
}

Streams

Early on in the node project the file system and network APIs had their own separate patterns for dealing with streaming I/O. For example, files in a file system have things called ‘file descriptors’ so the fs module had to have extra logic to keep track of these things whereas the network modules didn’t have such a concept. Despite minor differences in semantics like these, at a fundamental level both groups of code were duplicating a lot of functionality when it came to reading data in and out. The team working on node realized that it would be confusing to have to learn two sets of semantics to essentially do the same thing so they made a new API called the Stream and made all the network and file system code use it.

The whole point of node is to make it easy to deal with file systems and networks so it made sense to have one pattern that was used everywhere. The good news is that most of the patterns like these (there are only a few anyway) have been figured out at this point and it is very unlikely that node will change that much in the future.
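
To give a flavor of the pattern, here is a minimal sketch that copies a file by piping a readable stream into a writable one, so the whole file never has to sit in memory at once (the file names are made up):

var fs = require('fs')

var source = fs.createReadStream('movie.mp4')
var destination = fs.createWriteStream('movie-copy.mp4')

// pipe shovels the data across chunk by chunk and handles backpressure
source.pipe(destination)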

There are already two great resources that you can use to learn about node streams. One is the stream-adventure (see the Learn Node Interactively section) and the other is a reference called the Stream Handbook.

Stream Handbook

stream-handbook is a guide, similar to this one, that contains a reference for everything you could possibly need to know about streams.

stream-handbook

Modules

Node core is made up of about two dozen modules, some lower level ones like events and stream, and some higher level ones like http and crypto.

This design is intentional. Node core is supposed to be small, and the modules in core should be focused on providing tools for working with common I/O protocols and formats in a way that is cross-platform.

For everything else there is npm. Anyone can create a new node module that adds some functionality and publish it to npm. As of the time of this writing there are 34,000 modules on npm.

How to find a module

Imagine you are trying to convert PDF files into TXT files. The best place to start is by doing npm search pdf:

pdfsearch

There are a ton of results! npm is quite popular and you will usually be able to find multiple potential solutions. If you go through each module and whittle down the results into a more narrow set (filtering out things like PDF generation modules), you’ll end up with a handful of candidates.

A lot of the modules have overlapping functionality but present alternate APIs and most of them require external dependencies (like apt-get install poppler).

Here are some different ways to interpret the modules:

  • pdf2json is the only one that is written in pure JavaScript, which means it is the easiest to install, especially on low power devices like the raspberry pi or on Windows where native code might not be cross platform.
  • modules like mimeograph, hummus and pdf-extract each combine multiple lower level modules to expose a high level API
  • a lot of modules seem to sit on top of the pdftotext/poppler unix command line tools

Let’s compare the differences between pdftotextjs and pdf-text-extract, both of which are wrappers around the pdftotext utility.

pdf-modules

Both of these:

  • were updated relatively recently
  • have github repositories linked (this is very important!)
  • have READMEs
  • have at least some number of people installing them every week
  • are liberally licensed (anyone can use them)

Just looking at the package.json + module statistics it’s hard to get a feeling about which one might be the right choice. Let’s compare the READMEs:

pdf-readmes

Both have simple descriptions, CI badges, installation instructions, clear examples and instructions for running the tests. Great! But which one do we use? Let’s compare the code:

pdf-code

pdftotextjs is around 110 lines of code, and pdf-text-extract is around 40, but both essentially boil down to this line:

var child = shell.exec('pdftotext ' + self.options.additional.join(' '));

Does this make one any better than the other? Hard to say! It’s important to actually read the code and make your own conclusions. If you find a module you like, use npm star modulename to give npm feedback about modules that you had a positive experience with.

Modular development workflow

npm is different from most package managers in that it installs modules into a folder inside of other existing modules. The previous sentence might not make sense right now but it is the key to npm’s success.

Many package managers install things globally. For instance, if you apt-get install couchdb on Debian Linux it will try to install the latest stable version of CouchDB. If you are trying to install CouchDB as a dependency of some other piece of software and that software needs an older version of CouchDB, you have to uninstall the newer version of CouchDB and then install the older version. You can’t have two versions of CouchDB installed because Debian only knows how to install things into one place.

It’s not just Debian that does this. Most programming language package managers work this way too. To address the global dependencies problem described above, virtual environments have been developed, like virtualenv for Python or bundler for Ruby. These just split your environment up into many virtual environments, one for each project, but inside each environment dependencies are still globally installed. Virtual environments don’t always solve the problem; sometimes they just multiply it by adding additional layers of complexity.

With npm installing global modules is an anti-pattern. Just like how you shouldn’t use global variables in your JavaScript programs you also shouldn’t install global modules (unless you need a module with an executable binary to show up in your global PATH, but you don’t always need to do this – more on this later).

How require works

When you call require('some_module') in node here is what happens:

  1. if a file called some_module.js exists in the current folder node will load that, otherwise:
  2. node looks in the current folder for a node_modules folder with a some_module folder in it
  3. if it doesn’t find it, it will go up one folder and repeat step 2

This cycle repeats until node reaches the root folder of the filesystem, at which point it will then check any global module folders (e.g. /usr/local/node_modules on Mac OS) and if it still doesn’t find some_module it will throw an exception.

Here’s a visual example:

mod-diagram-01

When the current working directory is subsubfolder and require('foo') is called, node will look for the folder called subsubfolder/node_modules. In this case it won’t find it – the folder there is mistakenly called my_modules. Then node will go up one folder and try again, meaning it then looks for subfolder_B/node_modules, which also doesn’t exist. Third try is a charm, though, as folder/node_modules does exist and has a folder called foo inside of it. If foo wasn’t in there node would continue its search up the directory tree.

Note that if called from subfolder_B node will never find subfolder_A/node_modules, it can only see folder/node_modules on its way up the tree.

One of the benefits of npm’s approach is that modules can install their dependent modules at specific known working versions. In this case the module foo is quite popular - there are three copies of it, each one inside its parent module folder. The reasoning for this could be that each parent module needed a different version of foo, e.g. ‘folder’ needs foo@0.0.1, subfolder_A needs foo@0.2.1 etc.

Here’s what happens when we fix the folder naming error by changing my_modules to the correct name node_modules:

mod-diagram-02

To test out which module actually gets loaded by node, you can use the require.resolve('some_module') command, which will show you the path to the module that node finds as a result of the tree climbing process. require.resolve can be useful when double-checking that the module that you think is getting loaded is actually getting loaded – sometimes there is another version of the same module closer to your current working directory than the one you intend to load.
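
A quick sketch of it in use (foo is the made-up module from the diagrams above):

// prints the absolute path of the foo that require('foo') would load,
// e.g. something like folder/node_modules/foo/index.js
console.log(require.resolve('foo'))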

How to write a module

Now that you know how to find modules and require them you can start writing your own modules.

The simplest possible module

Node modules are radically lightweight. Here is one of the simplest possible node modules:

package.json:

{
  "name": "number-one",
  "version": "1.0.0"
}

index.js:

module.exports = 1

By default node tries to load module/index.js when you require('module'); any other file name won’t work unless you set the main field of package.json to point to it.
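
For instance, if your entry point lived at lib/main.js instead (a hypothetical layout), package.json would point at it like this:

{
  "name": "number-one",
  "version": "1.0.0",
  "main": "lib/main.js"
}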

Put both of those files in a folder called number-one (the name in package.json must match the folder name) and you’ll have a working node module.

Calling the function require('number-one') returns the value of whatever module.exports is set to inside the module:
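
Using it from another script would look like this (a sketch that assumes number-one is in a reachable node_modules folder):

var one = require('number-one')
console.log(one) // logs out 1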

simple-module

An even quicker way to create a module is to run these commands:

mkdir my_module
cd my_module
git init
git remote add origin git@github.com:yourusername/my_module.git
npm init

Running npm init will create a valid package.json for you and if you run it in an existing git repo it will set the repositories field inside package.json automatically as well!

Adding dependencies

A module can list any other modules from npm or GitHub in the dependencies field of package.json. To install the request module as a new dependency and automatically add it to package.json run this from your module root directory:

npm install --save request

This installs a copy of request into the closest node_modules folder and makes our package.json look something like this:

{
  "id": "number-one",
  "version": "1.0.0",
  "dependencies": {
    "request": "~2.22.0"
  }
}

By default npm install will grab the latest published version of a module.
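
If you need a particular version instead of the latest, you can ask for it explicitly with the name@version syntax:

npm install --save request@2.22.0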

Client side development with npm

A common misconception about npm is that since it has ‘Node’ in the name that it must only be used for server side JS modules. This is completely untrue! npm actually stands for Node Packaged Modules, e.g. modules that Node packages together for you. The modules themselves can be whatever you want – they are just a folder of files wrapped up in a .tar.gz, and a file called package.json that declares the module version and a list of all modules that are dependencies of the module (as well as their version numbers so the working versions get installed automatically). It’s turtles all the way down - module dependencies are just modules, and those modules can have dependencies etc. etc. etc.

browserify is a utility written in Node that tries to convert any node module into code that can be run in browsers. Not all modules work (browsers can’t do things like host an HTTP server), but a lot of modules on NPM will work.
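
The usual command line flow looks something like this (entry.js and bundle.js are made-up file names):

# install the browserify command globally
npm install -g browserify

# bundle up a node-style program into a single file for the browser
browserify entry.js > bundle.js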

To try out npm in the browser you can use RequireBin, an app I made that takes advantage of Browserify-CDN, which internally uses browserify but returns the output through HTTP (instead of the command line – which is how browserify is usually used).

Try putting this code into RequireBin and then hit the preview button:

var reverse = require('ascii-art-reverse')

// makes a visible HTML console
require('console-log').show(true)

var coolbear =
  "    ('-^-/')  \n" +
  "    `o__o' ]  \n" +
  "    (_Y_) _/  \n" +
  "  _..`--'-.`, \n" +
  " (__)_,--(__) \n" +
  "     7:   ; 1 \n" +
  "   _/,`-.-' : \n" +
  "  (_,)-~~(_,) \n"

setInterval(function() { console.log(coolbear) }, 1000)

setTimeout(function() {
  setInterval(function() { console.log(reverse(coolbear)) }, 1000)
}, 500)

Or check out a more complicated example (feel free to change the code and see what happens):

requirebin

Going with the grain

Like any good tool, node is best suited for a certain set of use cases. For example: Rails, the popular web framework, is great for modeling complex business logic, e.g. using code to represent real life business objects like accounts, loans, itineraries, and inventories. While it is technically possible to do the same type of thing using node, there would be definite drawbacks since node is designed for solving I/O problems and it doesn’t know much about ‘business logic’. Each tool focuses on different problems. Hopefully this guide will help you gain an intuitive understanding of the strengths of node so that you know when it can be useful to you.

What is outside of node’s scope?

Fundamentally node is just a tool used for managing I/O across file systems and networks, and it leaves other more fancy functionality up to third party modules. Here are some things that are outside the scope of node:

Web frameworks

There are a number of web frameworks built on top of node (framework meaning a bundle of solutions that attempts to address some high level problem like modeling business logic), but node is not a web framework. Web frameworks that are written using node don’t always make the same kind of decisions about adding complexity, abstractions and tradeoffs that node does and may have other priorities.

Language syntax

Node uses JavaScript and doesn’t change anything about it. Felix Geisendörfer has a pretty good write-up of the ‘node style’ here.

Language abstraction

When possible node will use the simplest possible way of accomplishing something. The ‘fancier’ you make your JavaScript the more complexity and tradeoffs you introduce. Programming is hard, especially in JS where there are 1000 solutions to every problem! It is for this reason that node tries to always pick the simplest, most universal option. If you are solving a problem that calls for a complex solution and you are unsatisfied with the ‘vanilla JS solutions’ that node implements, you are free to solve it inside your app or module using whichever abstractions you prefer.

A great example of this is node’s use of callbacks. Early on node experimented with a feature called ‘promises’ that added a number of features to make async code appear more linear. It was taken out of node core for a few reasons:

  • they are more complex than callbacks
  • they can be implemented in userland (distributed on npm as third party modules)

Consider one of the most universal and basic things that node does: reading a file. When you read a file you want to know when errors happen, like when your hard drive dies in the middle of your read. If node had promises everyone would have to branch their code like this:

fs.readFile('movie.mp4')
  .then(function(data) {
    // do stuff with data
  })
  .error(function(error) {
    // handle error
  })

This adds complexity, and not everyone wants that. Instead of two separate functions node just uses a single callback function. Here are the rules:

  • When there is no error pass null as the first argument
  • When there is an error, pass it as the first argument
  • The rest of the arguments can be used for anything (usually data or responses since most stuff in node is reading or writing things)

Hence, the node callback style:

fs.readFile('movie.mp4', function(err, data) {
  // handle error, do stuff with data
})

Threads/fibers/non-event-based concurrency solutions

Note: If you don’t know what these things mean then you will likely have an easier time learning node, since unlearning things is just as much work as learning things.

Node uses threads internally to make things fast but doesn’t expose them to the user. If you are a technical user wondering why node is designed this way then you should 100% read about the design of libuv, the C I/O layer that node is built on top of.

License

Creative Commons Attribution License (do whatever, just attribute me)
http://creativecommons.org/licenses/by/2.0/

Donate icon is from the Noun Project

“It wasn’t brains that got me here I can assure you of that.”

I cannot help rewatching this powerful scene from “Margin Call”. A masterclass in acting by the great Jeremy Irons.

Every sentence, glance, and gesture projects complete and menacing presence, power, and finality, all done to absolute perfection 👌

Telugu Script Components

Telugu is a phonetic language, written from left to right, with each character generally representing a syllable. There are 52 letters in the Telugu alphabet: 16 Achchulu, which denote basic vowel sounds, and 36 Hallulu, which represent consonants. In addition to these 52 letters, there are several semi-vowel symbols, called Maatralu, which are used in conjunction with Hallulu, and half consonants, called Voththulu, which are used to form clusters of consonants.

Improved Symbol Segmentation for TELUGU Optical Character Recognition

The Iyers, The Iyengars, The Lowells, The Cabots, and God

This is the city of Madras
The home of the curry and the dal
Where Iyers speak only to Iyengars
And Iyengars speak only to God.

I’d read this years ago some place and forgot where. Thought it would be in some Religious Studies textbook back from when I was (briefly) a Religious Studies major. Nope! It was the great Paul Erdős!

Erdős said he’d modelled it after this ditty about the privileged New England families famously known as the ‘Boston Brahmins’.

This is good old Boston
The home of the bean and the cod
Where the Lowells speak to the Cabots
And the Cabots speak only to God.

From Vijaysree Venkataraman’s article.

Some Laws of Software Engineering

by GlobalNerdy

Amdahl’s Law

The speedup gained from running a program on a parallel computer is greatly limited by the fraction of that program that can’t be parallelized.

Augustine’s Second Law of Socioscience

For every scientific (or engineering) action, there is an equal and opposite social reaction.

Brooks’ Law

Adding manpower to a late software project makes it later.

Clarke’s First Law

When a distinguished but elderly scientist states that something is possible he is almost certainly right. When he states that something is impossible, he is very probably wrong.

Clarke’s Second Law

The only way of discovering the limits of the possible is to venture a little way past them into the impossible.

Clarke’s Third Law

Any sufficiently advanced technology is indistinguishable from magic.

Conway’s Law

Any piece of software reflects the organizational structure that produced it.

Cope’s Rule

There is a general tendency toward size increase in evolution.

Dilbert Principle

The most ineffective workers are systematically moved to the place where they can do the least damage: management.

Ellison’s Law of Cryptography and Usability

The userbase for strong cryptography declines by half with every additional keystroke or mouseclick required to make it work.

Ellison’s Law of Data

Once the business data have been centralized and integrated, the value of the database is greater than the sum of the preexisting parts.

The Law of False Alerts

As the rate of erroneous alerts increases, operator reliance, or belief, in subsequent warnings decreases.

Fisher’s Fundamental Theorem

The more highly adapted an organism becomes, the less adaptable it is to any new change.

Fitts’ Law

The time to acquire a target is a function of the distance to and the size of the target.
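
(A sketch of my own: in its common Shannon formulation the law reads time = a + b * log2(D / W + 1), where D is the distance to the target and W its width; a and b are constants fitted to the pointing device, and the values below are purely illustrative.)

function fittsTime(distance, width, a, b) {
  // a and b are empirically fitted constants; the values passed below are made up
  return a + b * Math.log2(distance / width + 1)
}

console.log(fittsTime(800, 20, 0.1, 0.15)) // ≈ 0.90s: far, small target
console.log(fittsTime(100, 80, 0.1, 0.15)) // ≈ 0.28s: near, large target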

Flon’s Axiom

There does not now, nor will there ever, exist a programming language in which it is the least bit hard to write bad programs.

Gilder’s Law

Bandwidth grows at least three times faster than computer power.

Godwin’s Law

As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one.

Grosch’s Law

The cost of computing systems increases as the square root of the computational power of the systems.

Hartree’s Law

Whatever the state of a project, the time a project-leader will estimate for completion is constant.

Heisenbug Uncertainty Principle

Most production software bugs are soft: they go away when you look at them.

Hick’s Law

The time it takes to make a decision is a function of the number of possible choices available.
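
(Again a sketch of my own: Hick’s Law is commonly modeled, like Fitts’ Law above, with a logarithm, time = a + b * log2(n + 1) for n equally likely choices; the constants below are purely illustrative.)

function hickTime(choices, a, b) {
  // a and b are fitted constants; the values passed below are made up
  return a + b * Math.log2(choices + 1)
}

console.log(hickTime(4, 0.2, 0.15))  // ≈ 0.55s
console.log(hickTime(32, 0.2, 0.15)) // ≈ 0.96s: 8x the choices, nowhere near 8x the time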

Hoare’s Law of Large Programs

Inside every large program is a small program struggling to get out.

Hofstadter’s Law

A task always takes longer than you expect, even when you take into account Hofstadter’s Law.

Jakob’s Law of the Internet User Experience

Users spend most of their time on other sites. This means that users prefer your site to work the same way as all the other sites they already know.

Joy’s Law

smart(employees) = log(employees), or “No matter who you are, most of the smartest people work for someone else.”

Kerckhoffs’ Principle

In cryptography, a system should be secure even if everything about the system, except for a small piece of information — the key — is public knowledge.

Linus’s Law

Given enough eyeballs, all bugs are shallow.

Lister’s Law

People under time pressure don’t think faster.

Metcalfe’s Law

In network theory, the value of a system grows as approximately the square of the number of users of the system.
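
(A sketch of my own to show where the square comes from: n users can form n * (n - 1) / 2 distinct pairwise connections.)

function possibleConnections(users) {
  return users * (users - 1) / 2
}

console.log(possibleConnections(10))  // 45
console.log(possibleConnections(100)) // 4950: 10x the users, roughly 100x the connections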

Moore’s Law

The number of transistors on an integrated circuit will double in about 18 months.

Murphy’s Law

If there are two or more ways to do something, and one of those ways can result in a catastrophe, then someone will do it.

Nathan’s First Law

Software is a gas; it expands to fill its container.

Ninety-ninety Law

The first 90% of the code accounts for the first 90% of the development time. The remaining 10% of the code accounts for the other 90% of the development time.

Occam’s Razor

The explanation requiring the fewest assumptions is most likely to be correct.

Osborn’s Law

Variables won’t; constants aren’t.

Postel’s Law (the second clause of the Robustness Principle)

Be conservative in what you send, liberal in what you accept.

Pareto Principle (a.k.a. “The 80-20 Rule”)

For many phenomena, 80% of consequences stem from 20% of the causes.

Parkinson’s Law

Work expands so as to fill the time available for its completion.

Pesticide Paradox

Every method you use to prevent or find bugs leaves a residue of subtler bugs against which those methods are ineffectual.

The Peter Principle

In a hierarchy, every employee tends to rise to his level of incompetence.

Reed’s Law

The utility of large networks, particularly social networks, scales exponentially with the size of the network.

Rock’s Law

The cost of a semiconductor chip fabrication plant doubles every four years.

Sixty-sixty Rule

Sixty percent of software’s dollar is spent on maintenance, and sixty percent of that maintenance is enhancement.

Spector’s Law

The time it takes your favorite application to complete a given task doubles with each new revision.

Spafford’s Adoption Rule

For just about any technology, be it an operating system, application or network, when a sufficient level of adoption is reached, that technology then becomes a threat vector.

Sturgeon’s Revelation

Ninety percent of everything is crud.

Tesler’s Law of Conservation as Complexity

You cannot reduce the complexity of a given task beyond a certain point. Once you’ve reached that point, you can only shift the burden around.

Weibull’s Power Law

The logarithm of failure rates increases linearly with the logarithm of age.

Wirth’s Law

Software gets slower faster than hardware gets faster.

Zawinski’s Law

Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can.

The Chess Set

The Chess Sets used at the World Chess Championships cost $350 (plus $700 if you want the electronic piece tracking), are likely out of stock if you’d like one, take a lot of training and practice to make, are woodworked in Amritsar, India, and were designed by Daniel Weil, a former partner at Pentagram.

About half the set’s value lies with the most difficult piece to make: the Knight ♘

Chess set designed by Daniel Weil

Here’s Design Week on their conception, and more in this Business Insider video.

Norm Macdonald at Iowa

Battling cancer for 9 years without telling anyone is the most Norm Macdonald shit ever.

Anthony Jeselnik

I had no idea that he performed at the Hancher Auditorium at The University of Iowa in 1997 and mortified most of a crowd of 1,200 excited kids and their parents, who’d come to see him, Darrell Hammond, and Jim Breuer at the height of their SNL fame. The 250 or so who remained appear to have had a fantastic time:

"When Norm took the stage he immediately launched into a bit that was intentionally supposed to be offensive to most of the audience. As it went on, people got up and left in large numbers. Each time a new group would leave, he would make a remark like, ‘Did you think I was going to do airline jokes?’ or ‘Did you think I was going to hold up a picture of the Ayatollah and make a joke?’ Then he would double down on the dirty material to see how many more people he could drive out. It was clearly a game to see if he could empty out the place and after some time probably over two-thirds of the crowd had left.

[…] "Whoever booked the show comes in the green room all sweaty with his tie askew. He looked like he had seen a ghost. He says ‘You gotta go get him.’ Me, pull Norm off the stage? I’m not getting him. Breuer was laughing. He says, ‘I’m not getting him.’

"Me [Hammond] and Breuer knew something special was happening. He and I got chairs and sat on the side of the stage where nobody could see us. It was one of the most-brilliant shows I’ve ever seen.

"Anybody who thought Norm would change his act was sorely mistaken. Norm just didn’t care.

“He’s revered in the comedians’ world. He doesn’t bother with pretenses or correctness. He’s probably the original politically incorrect comedian. It’s not for a shock factor. It’s just who he is.”

Mike Hlas, “The night Norm Macdonald mortified the University of Iowa”, The Gazette (cached)

Here’s The Des Moines Register’s report:

Report on Norm Macdonald's performance at the University of Iowa in The Des Moines Register

Here’s Jim Breuer on the incident:

On Luxury

Danny Pudi keeping it real.

“Uh… a luxury you can’t live without.”
“A luxury I can’t live without… Coffee. I really like it.”
“Luxury… you can get it anywhere.”
“Ah I guess, yeah. Like good coffee…”
“I love coffee too.”
“I like nice socks.”
“Socks. Your socks you would put in your shoes.”
“Yeah. I really love them. I like kind of like you know, cozy feet.”
“You’re attracted to your socks.”
“I’m attracted to really nice running socks. Like I’m always looking for good running…”
“That’s not a luxury, though. Coffee and socks are not a luxury at all.”
“Alright give me a luxury. What luxury should I have?”
“Private plane.”
“Larry. I’m on Duck Tales.”

Mersal

Mersal (2017)

IMDb

Rating: C

Saw with NN. At least twice as long as it needs to be. Didn’t care about the score. It’s three hours of Vijay doing Vijay things with gusto. Spoiler: I understand that mass Indian entertainers, particularly the South Indian ones, have a tenuous relationship with reality. But we are to be OK with two siblings, born five years apart, looking like facsimiles of each other. They didn’t even bother shaving the mustache of the younger bro. Come the fuck on.

Candyman

Candyman (2021)

IMDb

Rating: B+

Saw with BE and NN. Eh. Clear messages about creatives’ struggles and temptations, and the importance of continuing to tell past and present stories of horrific pain and suffering.

I suppose I just lazily wanted to watch a well-made scary movie without actively engaging with it, without searching for the clever and occasionally deep symbolism that has come to characterize a movie with Jordan Peele’s name on it. It was adequately scary.

The title of Anthony’s piece [“Say His Name”] also is recognizable as a play on the Say Her Name slogan meant to memorialize victims of anti-Black violence and police brutality such as Breonna Taylor and Sandra Bland. The recognition of that inference is the only point of connection to it.

Beyond that, little about the plot makes a statement about over-policing or the socio-economic violence that gentrification creates by destroying and displacing low-income communities. Its characters blithely discuss these concerns over drinks or Brianna’s well-appointed living room, but only as part of a litany of urban ills. The sequences are the film’s ways of throwing a message that’s on-brand for 2021 behind a horror movie meant to speak to an audience that supports protests against racial injustice and biased policing in principle without having any actual skin in the game.

To those impacted in a real way by these issues or savvy enough to recognize when they’re being used as mechanisms to impart a sense of relevance, they come across as didactic nonsense. All that noise strangles the twin melodies that make up the Candyman character’s siren song: seduction and legacy.

Melanie McFarland, “No sweets for the sweet in new “Candyman,” which neglects the legend’s seductively scary legacy”, Salon

Speaking of these “twin melodies”: I haven’t seen the 1992 original and it’s on my list. Didn’t know that Philip Glass did the score for the movie.

A Big Collection of Bog Bodies

by Gabe Paoletti

In many cases, you’re staring at the face of someone who lived centuries ago. That was their hair, their nose, their eyelashes, their sleep. Very few things are more fascinating than this.

Borremose Man

The Borremose Man died in the 7th century BCE. He was bludgeoned to death with a blow to the back of his head and had a rope with a slip knot tied around his neck. It is believed that he was a human sacrifice. He was found in the Borremose peat bog in Himmerland, Denmark in 1946. Shortly after, two other, less well-preserved bodies were discovered in the same marsh. Credit: Danish National Museum/Wikimedia Commons

Tollund Man

The face of the Tollund Man. Credit: Sven Rosborn/Wikimedia Commons

Yde Girl

The Yde Girl died sometime between 54 BCE and 128 CE at an approximate age of 16 years old. She suffered from scoliosis and had long reddish-blonde hair that was preserved by the swamp. She was buried with a ritually tied woolen braid around her neck, suggesting she was killed as a human sacrifice. However, due to damage to the body at the time of discovery, the cause of her death is unknown. She was found outside the village of Yde, Netherlands. Credit: Drents Museum/Wikimedia Commons

Grauballe Man

The Grauballe Man died during the late 3rd century BCE when he was around thirty years old. He was found naked, with no indication of any clothing around him, in a bog in Jutland, Denmark in 1955. His neck had been slit from ear to ear. His well-preserved hair was likely dark brown during his life but was turned red by the bog. Historians believe he was likely a human sacrifice. Credit: Sven Rosborn/Wikimedia Commons

Tollund Man

The Tollund Man was an approximately 40-year-old man who was killed sometime between 375 and 210 BCE. He was found with a noose around his neck, indicating he was hanged to death, as well as a sheepskin cap on his head. He was found in a bog outside of the Danish town of Silkeborg in 1950. Credit: Wikimedia Commons

Damendorf Man

The Damendorf Man died around 300 BCE and had his body squashed flat by the weight of the peat that accumulated on top of him. He was found in a bog outside the German town of Damendorf in 1900 with a leather belt, shoes, and a pair of breeches. Credit: Bullenwächter/Wikimedia Commons

Bocksten Man

The Bocksten Man likely lived sometime between 1290 and 1430. He was a tall, slender man, most likely in his 40s at the time of his death. He was killed and then pinned to the bed of a lake, which would later become a bog, with two wooden poles, one driven directly through his heart. The impaling likely happened after his death, as he also had a large wound on his head. He was found in a bog near Varberg Municipality, Sweden in 1936. His hair was found perfectly preserved, and he was also discovered with a hooded garment and an engraved leather sheath. Credit: Peter Lindberg/Wikimedia Commons

Arden Woman

The Arden Woman lived during the 14th Century BCE and was around 20–25 years old at the time of her death. She was found in the Bredmose bog in Hindsted, Denmark in 1942. Police said the corpse was found in a ‘question mark’ shape. Her well-preserved hair was dark blond, drawn into two pigtails, and coiled around the top of her head. Unlike some bog bodies, she was found with garments and with no evidence of a violent death. Credit: P.V. Glob/Wikimedia Commons

Grauballe Man

The full body of The Grauballe Man. His hands were so well preserved that researchers were able to take the fingerprints of the over 2,000-year-old body. Credit: Colin/Wikimedia Commons

Clonycavan Man

The Clonycavan Man was an Irish man who died sometime between 392 BCE and 201 BCE. He was 5’2”, with a squashed nose, crooked teeth, and gelled-up hair. He was killed by an ax blow to the back of his head. The Clonycavan Man was discovered in 2003 in Clonycavan, Ireland when he was picked up by a modern peat harvesting machine that mangled his lower body. His rich diet, imported hair gel, and death near a hill used for kingly initiation led historians to theorize that he was a king who was ritually sacrificed after a bad harvest. Credit: Mark Healey/Wikimedia Commons

Kreepen Man

The Kreepen Man was a body discovered in a bog in 1903 near Verden, Germany. The body had twisted oak and willow branches binding his hands and feet. After its discovery, the body was sold to The Museum of European Cultures in Berlin but was destroyed when the city was bombed during WWII. Hair found at the site, believed to belong to the Kreepen Man, dates to between 1440 and 1520, but without the body the genuine date of death is unknown. Credit: Andreas Franzkowiak/Wikimedia Commons

Huldremose Woman

The Huldremose Woman died sometime between 160 BCE and 340 CE and was over 40 years old at the time of her death. She had a rope around her neck indicating she may have been strangled or hanged to death. There is also a laceration on one of her feet. She was found with an elaborate wool plaid cape, scarf, and skirt. She was found by a school teacher in 1879 in a peat bog near Ramten, Denmark. Credit: Kira Ursem/Wikimedia Commons

Weerdinge Men

The Weerdinge Men are two naked bog bodies found in Drenthe, the Netherlands in 1904. They would have lived sometime between 60 BCE and 220 CE. One of the men had a large cut in his abdomen, through which his intestines spilled out, which some historians believe indicates that he was cut open so an ancient druid could divine the future from his entrails. Credit: Wikimedia Commons

Röst Girl

The Röst Girl is thought to have died sometime between 200 BCE and 80 CE in a bog in the Schleswig-Holstein state of Germany. She was discovered in 1926, but the cause of her death is unknown because her body was destroyed during WWII. Credit: Wikimedia Commons

Old Croghan Man

The Old Croghan Man lived sometime between 362 BCE and 175 BCE and would have been around 20 years old at the time of his death. His torso, missing the head and lower body, was discovered in 2003 in a bog near Croghan Hill in Ireland. From his arm span, it is believed he would have been 6’6”. Credit: Mark Healey/Wikimedia Commons

Roter Franz

Roter Franz died in the Bourtanger Moor, on what is now the border of Germany and the Netherlands, sometime between 220 and 430 CE during the Roman Iron Age. The name Roter Franz (meaning Red Franz in English) is derived from the red hair and beard discovered on the body. He was killed when his throat was slit and had an arrow wound on his shoulder. Credit: Axel Hindemith/Wikimedia Commons

Osterby Head

The Osterby Head was discovered in 1948 in a bog to the southeast of Osterby, Germany. The man whose head this belonged to lived sometime between 75 and 130 CE and was 50 to 60 years of age when he died. Evidence shows that he was struck in the head fatally and then beheaded. His hair was tied in a Suebian knot, indicating he was likely a free man of the Germanic Suebi tribe. Credit: Andreas Franzkowiak/Wikimedia Commons

Kraglund Man

The Kraglund Man was discovered in 1898 in Nordjylland, Denmark. He is believed to have been male, but there is little documentation, and the body has been lost. He was the first bog body to be photographed before being moved from where it was discovered. Credit: Georg Sarauw /Wikimedia Commons

Rendswühren Man

The Rendswühren Man was a 40- to 50-year-old man who died in the 1st century CE. He is believed to have been beaten to death and was buried with his clothing, a rectangular wool cloak, and a fur cape. He was discovered outside the town of Rendswühren in Germany in 1871. Credit: Andreas Franzkowiak/Wikimedia Commons

Rendswühren Man

A picture of the Rendswühren Man taken in 1873, two years after he was discovered. Credit: Johanna Mestorf/Wikimedia Commons

Roum Head

The Roum Head was found in Himmerland, Denmark, and belonged to a man in his 20s who died during the Iron Age. The find was originally titled “The Roum Woman” until traces of beard stubble were found on the face. Credit: Wikimedia Commons

Haraldskær Woman

The Haraldskær Woman was discovered in a bog in Jutland, Denmark in 1892. When she was discovered, she was believed to be Queen Gunnhild of Norway, a quasi-historical figure from around 1000 CE who was said to have been drowned in a bog. Thinking it was their ancient queen, the Danish monarchy had the body placed in an elaborate glass-covered sarcophagus inside St. Nicolai Church in central Vejle, Denmark. In 1977, radiocarbon dating proved that the woman actually lived nearly 1,500 years before the revered queen, and likely died in the 5th century BCE. She was around 40 years old at the time of her death. Credit: McLeod/Wikimedia Commons

Gunhild Glass

The Haraldskær Woman in her glass-covered sarcophagus. Credit: Västgöten/Wikimedia Commons

Kayhausen Boy

The Kayhausen Boy was a child aged 7 to 10 years old who is thought to have been killed between 300 and 400 BCE. He had an infected socket at the top of his femur that would likely have made him unable to walk. His killers bound his hands and feet with cloth torn from a fur cape and stabbed him four times. His body was discovered in a sphagnum bog in Lower Saxony, Germany in 1922. Credit: Department of Legal Medicine, Universitatsklinikum Hamburg-Eppendorf

An Annoyed, Shivering, Nude Woman with Large Lapis Lazuli Glasses

Carved by someone in Ancient Egypt between 3700–3500 BCE.

Bone figure of a woman c. 3700–3500

[…] most of them represent nude females with their feminine attributes emphasised by carving and careful drilling. With their slim figures, narrow waists and full hips they present an ideal of the female body that will change little over the course of Ancient Egyptian civilization. Their enduring concept of beauty also included a full head of hair (the bald ones may have had wigs) as well as large and alluring eyes.

Google Arts and Culture

On Spite

In his book Dying of Whiteness, Metzl told of the case of a forty-one-year-old white taxi driver who was suffering from an inflamed liver that threatened the man’s life. Because the Tennessee legislature had neither taken up the Affordable Care Act nor expanded Medicaid coverage, the man was not able to get the expensive, lifesaving treatment that would have been available to him had he lived just across the border in Kentucky. As he approached death, he stood by the conviction that he did not want the government involved. “No way I want my tax dollars paying for Mexicans or welfare queens,” the man told Metzl. “Ain’t no way I would ever support Obamacare or sign up for it. I would rather die.” And sadly, so he would.

Isabel Wilkerson, “Caste: The Origins of Our Discontents”

Now, you might wish to let that simmer for a few minutes. With his health as shaky as a Jenga tower, with his very life ebbing away, Trevor’s greater concern – his greater fear – was of undeserving “Mexicans or welfare queens” benefiting from his taxes, however much that might be on the wages of a used-to-be cab driver eking out his last days in a low-income housing facility.

If that’s sad and ridiculous – and it is both – it is also predictable. From the beginning, white fear has been a great, unspoken driver of this nation’s sins against difference. So Trevor is just a link in an unbroken line that binds Lincoln fretting about retribution from newly freed slaves, to Roosevelt worrying about treachery from Americans of Japanese heritage, to Trump seeing terrorism in brown-skinned toddlers on the southern border.

Decade after decade, election after election, so much of the white conservative appeal is an implicit promise to defend whiteness from blacks and browns. Metzl argues that white people themselves have borne and are bearing a terrific cost for this “defense,” that they are, in effect, killing themselves.

Leonard Pitts, “Dying of Whiteness”

Paraphrasing a comment I read on Instagram: “You will let your Orange Highness shit on your head if it means that the liberal standing next to you has to smell it.”

A Hundred Humans

Allysson Lucca is a Brazilian designer who took this original list (cached) of what the world would look like with 1,000 people and shrank it to a hundred.

The idea of reducing the world’s population to a community of only 100 people is very useful and important. It makes us easily understand the differences in the world. There are many types of reports that use the Earth’s population reduced to 100 people, especially in the Internet. Ideas like this should be more often shared, especially nowadays when the world seems to be in need of dialogue and understanding among different cultures, in a way that it has never been before.

Transcribed from the graphic:

  • 50 men, 50 women
  • 61 Asians, 14 Africans, 14 people from the Americas, 11 Europeans
  • 33 Christians, 22 Muslims, 14 Hindus, 7 Buddhists, 12 people who practice other religions, and 12 people not aligned with a religion
  • Only 7 would have a college degree
  • 51 would live in urban areas
  • 14 would live with some disability
  • 15 would be undernourished
  • 37 would still lack access to adequate sanitation
  • The village’s military expenditure would be $1.7 trillion, against only $18 billion in humanitarian assistance
  • 20 people would own 75% of the village’s income
  • 30 would be active internet users
  • 48 would live on less than $2 per day and 80 on less than $10 a day

State of the Village Report

by Donella Meadows

“If the world were a village of 1000 people.” This was written in 1990.

If the world were a village of 1000 people:

  • 584 would be Asians
  • 123 would be Africans
  • 95 would be East and West Europeans
  • 84 Latin Americans
  • 55 Soviets (still including for the moment Lithuanians, Latvians, Estonians, etc.)
  • 52 North Americans
  • 6 Australians and New Zealanders

The people of the village would have considerable difficulty communicating:

  • 165 people would speak Mandarin
  • 86 would speak English
  • 83 Hindi/Urdu
  • 64 Spanish
  • 58 Russian
  • 37 Arabic

That list accounts for the mother tongues of only half the villagers. The other half speak (in descending order of frequency) Bengali, Portuguese, Indonesian, Japanese, German, French, and 200 other languages.

In the village there would be:

  • 300 Christians (183 Catholics, 84 Protestants, 33 Orthodox)
  • 175 Moslems
  • 128 Hindus
  • 55 Buddhists
  • 47 Animists
  • 210 all other religions (including atheists)

One-third (330) of the people in the village would be children. Half the children would be immunized against the preventable infectious diseases such as measles and polio.

Sixty of the thousand villagers would be over the age of 65.

Just under half of the married women would have access to and be using modern contraceptives.

Each year 28 babies would be born.

Each year 10 people would die, three of them for lack of food, one from cancer. Two of the deaths would be to babies born within the year.

One person in the village would be infected with the HIV virus; that person would most likely not yet have developed a full-blown case of AIDS.

With the 28 births and 10 deaths, the population of the village in the next year would be 1018. In this thousand-person community, 200 people would receive three-fourths of the income; another 200 would receive only 2% of the income. Only 70 people would own an automobile (some of them more than one automobile).

About one-third would not have access to clean, safe drinking water. Of the 670 adults in the village half would be illiterate. The village would have 6 acres of land per person, 6000 acres in all of which:

  • 700 acres cropland
  • 1400 acres pasture
  • 1900 acres woodland
  • 2000 acres desert, tundra, pavement, and other wasteland

The woodland would be declining rapidly; the wasteland increasing; the other land categories would be roughly stable. The village would allocate 83 percent of its fertilizer to 40 percent of its cropland — that owned by the richest and best-fed 270 people. Excess fertilizer running off this land would cause pollution in lakes and wells. The remaining 60 percent of the land, with its 17 percent of the fertilizer, would produce 28 percent of the foodgrain and feed 73 percent of the people. The average grain yield on that land would be one-third the yields gotten by the richer villagers.

If the world were a village of 1000 persons, there would be five soldiers, seven teachers, one doctor. Of the village’s total annual expenditures of just over $3 million per year, $181,000 would go for weapons and warfare, $159,000 for education, $132,000 for health care.

The village would have buried beneath it enough explosive power in nuclear weapons to blow itself to smithereens many times over. These weapons would be under the control of just 100 of the people. The other 900 people would be watching them with deep anxiety, wondering whether the 100 can learn to get along together, and if they do, whether they might set off the weapons anyway through inattention or technical bungling, and if they ever decide to dismantle the weapons, where in the village they will dispose of the dangerous radioactive materials of which the weapons are made.

Afghanistan

  • 47,245 Civilians Killed
  • 2,442 US Troops Killed
  • 20,666 US Troops Wounded
  • 66,000 - 69,000 Afghan Troops Killed
  • $2.26 Trillion Taxpayer Dollars

Via NPR. And then:

Just days before, Pardis had confided to his friend that he was receiving death threats from the Taliban, who had discovered he had worked as a translator for the United States Army for 16 months during the 20-year-long conflict.

“They were telling him you are a spy for the Americans, you are the eyes of the Americans and you are infidel, and we will kill you and your family,” his friend and co-worker Abdulhaq Ayoubi told CNN.

As he approached the checkpoint, Pardis put his foot on the accelerator to speed through. He was not seen alive again.

Afghan interpreter for US Army was beheaded by Taliban. Others fear they will be hunted down too”, CNN

What a nightmare, twenty years on. And it’s not like the powers that be didn’t know what they were getting into. Heck, here’s a scene from Rambo (via WN).

Whence “Gubernatorial”?

I’m put off by the word “gubernatorial” whenever I see it. Seems very silly, saccharine, like something a 5-year-old mispronounced in 1953 that just stuck because it was so cute 🙄

Nope.

“Because, if you go back to where this word came from, in the original Latin, it’s from the verb, gubernare and gubernator, one who governs,” [Lisa McLendon, professor, University of Kansas School of Journalism] says.

Then, “governor, with the ‘v,’ came into English from French in about the 14th century,” she says. “French had taken the Latin and they swapped the ‘b’ for a ‘v.’”

English speakers went back to the “b” about 400 years later, but just for gubernatorial. And, there’s the split.

“Where Does The Term ‘Gubernatorial’ Come From?”, NPR

Today I Learned that the World Record for Walking A Mile is 5:31.08

It belongs to Tom Bosworth. I thought “I cannot even run that fast” and decided to look at the video.

While it certainly does look like they’re running, there are some severe restrictions on their movement, of course, else it’d be an event called ‘Trotting’. The rules are:

  • One foot must touch the ground at all times
  • The leading leg must be straight when the foot touches the ground
  • It must remain straight until it passes under the torso

Judges look out for any infractions by eye (no technology) and disqualify walkers accordingly.