Dil Se
A most beautiful, graceful, and sublime rendition of one of my all-time favorite songs.
Lovely, lovely stuff 🙏❤️ (via Deepu)
See also: Worlds Beyond the Stars
Responding to this chilling comment:
You are failing to understand genocide itself. INTENT, is the word, DELIBERATION. Deliberation to destroy an ethnic group. There was NEVER a deliberate attempt to destroy native culture in the Americas. In fact, you have laws since the 1512 protecting their rights and equalising them to Iberian Crown subjects, “Las Leyes de Burgos”.
Because, you see, unintentional genocide is A-OK.
I see I’ve been summoned. Your comments in this thread make it clear that nothing will change your position. It’s a difficult position to combat, because it’s in such defiance of literally anything written on the topic in at least the last 50 years. You are not operating off the same foundations of evidence that others are, and for that reason I suspect they, like me, are not terribly interested in arguing. Because it’s unlikely your drivel will be removed, I’m posting some quotes and links for those who see this thread later and think you might have even begun to approach a point supported by any specialist on the topic. I do not intend these to be comprehensive; there are myriad examples of “deliberate attempts to destroy native culture in the Americas” in, well, literally any single book or article you can pick up about the era. Rather, because you’ve instead insisted there never was any such thing, I’ve provided some obvious examples.
A primary goal of the Spanish colonial regime was to completely extirpate indigenous ways of life. While this was nominally about conversion to Catholicism, those in charge made it quite explicit that “conversion” not only should be but needed to be a violent process. Everything potentially conceivable as an indigenous practice, be it burial rituals, ways to build houses, or farming technologies, was targeted. To quote historian Peter Gose:
only by rebuilding Indian life from the ground up, educating, and preventing (with force if necessary) the return to idolatry could the missionary arrest these hereditary inclinations and modify them over time.
Francisco de Toledo, Viceroy of Peru, made clear in a 1570 decree that failure to comply with Catholicism was an offense punishable by death and within secular jurisdiction:
And should it occur that an infidel dogmatizer be found who disrupts the preaching of the gospel and manages to pervert the newly converted, in this case secular judges can proceed against such infidel dogmatizers, punishing them with death or other punishments that seem appropriate to them, since it is declared by congresses of theologians and jurists that His Majesty has convened in the Kingdoms of Spain that not only is this just cause for condemning such people to death, but even for waging war against a whole kingdom or province with all the death and damage to property that results
The same Toledo decreed in 1580 that Catholic priests and secular judges and magistrates should work together to destroy indigenous burial sites:
I order and command that each magistrate ensure that in his district all the tower tombs be knocked down, and that a large pit be dug into which all of the bones of those who died as pagans be mixed together, and that special care be taken henceforth to gather the intelligence necessary to discover whether any of the baptized are buried outside of the church, with the priest and the judge helping each other in such an important matter
Not only was the destruction of native culture a top-down decree, resistance was explicitly a death sentence.
The contemporary diversity of Latin America is not the result of natural “intermixing,” but the failure of the Spanish to assert themselves and the continuous resistance of the indigenous population. As early as 1588, we see letters from local priests airing grievances about the failure of the reduccion towns they were supposed to relocate native families to:
‘the corregidores are obliged, and the governors, to reduce the towns and order them reduced, and to build churches, take care to find out if the people come diligently for religious instruction and mass, to make them come and help the priest, and punish the careless, lazy, and bad Indians in the works of Christianity, as the ordinances of don Francisco de Toledo require, [but] they do not comply. Rather, many of the towns have yet to be reduced, and many churches are yet to be built, and a large part of the Indians are fled to many places where they neither see a priest nor receive religious instruction.
Reduccion was not a voluntary process, nor was it a question of simply “moving away.” Not only did it involve the destruction of native religious sites, it frequently involved the destruction of entire towns to repurpose building material and ensure people could not return. In fact, where we do see more voluntary participation in Spanish colonial structures, usually because of the political legibility and opportunities it provided, the resulting syncretism becomes an ever greater source of anxiety for the Spanish. Indigenous elites could selectively participate in Catholicism and game the system to their benefit - not something the state wanted to admit could happen.
These quotes come from Gose’s chapter on reducciones uploaded here.
I will also provide this section from the conclusion of Nicholas Robins’ book Mercury, Mining, and Empire; the entirety is uploaded here. The quoted chunk below is a summary of the various historical events presented in that chapter.
The white legend held much historiographical sway throughout the nineteenth and much of the twentieth centuries, and in no small part reflected a selective focus on legal structures rather than their application, subsumed in a denigratory view of native peoples, their cultures, and their heritage. As later twentieth-century historians began to examine the actual operation of the colony, the black legend again gained ascendance. As Benjamin Keen wrote, the black legend is “no legend at all.”
Twentieth-century concepts of genocide have superseded this debate, and the genocidal nature of the conquest is, ironically, evident in the very Spanish laws that the advocates of the white legend used in their efforts to justify their position. Such policies in Latin America had a defining influence on Rafael Lemkin, the scholar who first developed the term genocide in Axis Rule in Occupied Europe. As developed by Lemkin, “Genocide has two phases: one, destruction of the national pattern of the oppressed group; the other, the imposition of the national pattern of the oppressor,” which often included the establishment of settler colonies. Because of the intimate links between culture and national identity, Lemkin equated intentional cultural destruction with genocide. It was in no small part a result of his tireless efforts that in 1948 the United Nations adopted the definition of genocide which, despite its shortcomings, serves today as international law. The fact that genocide is a modern concept and that colonists operated within the “spirit of the times” in no way lessens the genocidal nature of their actions. It was, in fact, historical genocides, including those in Latin America, that informed Lemkin’s thinking and gave rise to the term.
Dehumanization of the victim is the handmaiden of genocide, and that which occurred in Spanish America is no exception. Although there were those who recognized the humanity of the natives and sought to defend them, they were in the end a small minority. The image of the Indian as a lazy, thieving, ignorant, prevaricating drunkard who only responded to force was, perversely, a step up from the ranks of nonhumans in which they were initially cast. The official recognition that the Indians were in fact human had little effect in their daily lives, as they were still treated like animals and viewed as natural servants by non-Indians. It is remarkable that the white legend could ever emerge from this genocidogenic milieu. With the path to genocide thus opened by the machete of dehumanization, Spanish policies to culturally destroy and otherwise subject the Amerindians as a people were multifaceted, consistent, and enduring. Those developed and implemented by Viceroy Francisco de Toledo in Peru in the 1570s have elevated him to the status of genocidier extraordinaire.
Once an Indian group had refused to submit to the Spanish crown, they could be legally enslaved, and calls for submission were usually made in a language the Indians did not understand and were often out of earshot. In some cases, the goal was the outright physical extermination or enslavement of specific ethnic groups whom the authorities could not control, such as the Chiriguano and Araucanian Indians. Another benefit from the crown’s perspective was that restive Spaniards and Creoles could be dispatched in such campaigns, thus relieving cities and towns of troublemakers while bringing new lands and labor into the kingdom. Ironically, de Toledo’s campaign to wipe out the Chiriguano contributed to his own ill health. Overall, however, genocidal policies in the Andes and the Americas centered on systematic cultural, religious, and linguistic destruction, forced labor, and forced relocation, much of which affected reproduction and the ability of individuals and communities to sustain themselves.
The forced relocation of Indians from usually spread-out settlements into reducciones, or Spanish-style communities, had among its primary objectives the abolition of indigenous religious and cultural practices and their replacement with those associated with Catholicism. As native lands and the surrounding geographical environment had tremendous spiritual significance, their physical removal also undermined indigenous spiritual relationships. Complementing the natives’ spiritual and cultural control was the physical control, and thus access to labor, offered by the new communities. The concentration of people also inadvertently fostered the spread of disease, giving added impetus to the demographic implosion. Finally, forced relocation was a direct attack on traditional means of sustenance, as many kin groups settled in and utilized the diverse microclimates of the region to provide a variety of foodstuffs and products for the group.
Integrated into this cultural onslaught were extirpation campaigns designed to seek out and destroy all indigenous religious shrines and icons and to either convert or kill native religious leaders. The damage matched the zeal and went to the heart of indigenous spiritual identity. For example, in 1559, an extirpation drive led by Augustinian friars resulted in the destruction of about 5,000 religious icons in the region of Huaylas, Peru, alone. Cultural destruction, or ethnocide, also occurred on a daily basis in Indian villages, where the natives were subject to forced baptism as well as physical and financial participation in a host of Catholic rites. As linchpins in the colonial apparatus, the clergy not only focused on spiritual conformity but also wielded formidable political and economic power in the community. Challenges to their authority were quickly met with the lash, imprisonment, exile, or the confiscation of property.
Miscegenation, often though not always through rape, also had profound personal, cultural, and genetic impacts on indigenous people. Part of the reason was the relative paucity of Spanish women in the colony, while power, opportunity, and impunity also played important roles. Genetic effacement was, in the 1770s, complemented by efforts to illegalize and eliminate native languages. A component in the wider effort to deculturate the indigenes, such policies were implemented with renewed vigor following the Great Rebellion of 1780–1782. Such laws contained provisions making it illegal to communicate with servants in anything but Spanish, and any servant who did not promptly learn the language was to be fired. The fact that there are still Indians in the Andes does not diminish the fact that they were victims of genocide, for few genocides are total.
Lastly, I would direct readers to the following article: Levene, Mark. 1999. “The Chittagong Hill Tracts: A Case Study in the Political Economy of ‘Creeping’ Genocide.” Third World Quarterly 20 (2): 339–69.
Though it talks about events a world away, its discussion of genocide is pertinent here. From the abstract:
The destruction of indigenous, tribal peoples in remote and/or frontier regions of the developing world is often assumed to be the outcome of inexorable, even inevitable forces of progress. People are not so much killed, they become extinct. Terms such as ethnocide, cultural genocide or developmental genocide suggest a distinct form of ‘off the map’ elimination which implicitly discourages comparison with other acknowledged examples of genocide. By concentrating on a little-known case study, that of the Chittagong Hill Tracts (CHT) in Bangladesh, this article argues that this sort of categorisation is misplaced. Not only is the destruction or attempted destruction of fourth world peoples central to the pattern of contemporary genocide but, by examining such specific examples, we can more clearly delineate the phenomenon’s more general wellsprings and processes. The example of the CHT does have its own peculiar features; not least what has been termed here its ‘creeping’ nature. In other respects, however, the efforts of a new nation-state to overcome its structural weaknesses by attempting a forced-pace consolidation and settlement of its one, allegedly, unoccupied resource-rich frontier region closely mirrors other state-building, developmental agendas which have been confronted with communal resistance. The ensuing crisis of state–communal relations, however, cannot be viewed in national isolation. Bangladesh’s drive to develop the CHT has not only been funded by Western finance and aid but is closely linked to its efforts to integrate itself rapidly into a Western dominated and regulated international system. It is in these efforts ‘to realise what is actually unrealisable’ that the relationship between a flawed state power and genocide can be located.
Genocide need not be a state program uniquely articulated to eliminate a people or their culture. Rather, it is often disguised in the name “progress” or “development.” This connects to the Spanish colonial economic system, based on what Robins (above) calls the “ultra-violence” of forced labor in mines.
I love this ring as much as I love Mr. Stanley Tucci.
It’s from The Devil Wears Prada and is an aqeeq ring (the word “means quartz in Arabic, and agate in Turkish”), which was a pretty common sight on older hands when I was growing up.
I looked far and wide for a replica and found this on Etsy[1]. That price aside, not bad at all!
[1] Here’s a picture in case that listing gets removed.
Eighth Wonder of the World indeed. Here’s a nice calculator that draws graphs, and allows for monthly contributions and rate variances.
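For the curious, the arithmetic behind such calculators is tiny. Here is a sketch, assuming monthly compounding and a fixed annual rate (the function name and sample numbers are just for illustration):

// future value of a principal P plus a monthly contribution C,
// at annual rate r (e.g. 0.07), compounded monthly for n years
function futureValue(P, C, r, n) {
  var months = n * 12
  var i = r / 12 // monthly rate
  var growth = Math.pow(1 + i, months)
  return P * growth + C * (growth - 1) / i
}

console.log(futureValue(10000, 500, 0.07, 30)) // roughly 690,000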
The Dune screenplay was written in “Movie Master”, an MS-DOS program. It has a 40-page limit, which helps the writer, Eric Roth.
In case that site is unavailable, and for the year 2020, it’s an exponential curve with the
Until not too long ago, I used to think that “the top 1%” referred to “few hundreds of millions” millionaires or billionaires 🤷‍♂️
Old stuff, but always immensely satisfying to read ♥️
by the wonderful Amii James (Insta)
A fantastic introduction to Node (for, say, someone coming in from Python or Ruby land).
This document is intended for readers who know at least a little bit of a couple of things:
In addition to reading this guide it’s super important to also bust out your favorite text editor and actually write some node code. I always find that when I just read some code in a book it never really clicks, but learning by writing code is a good way to grasp new programming concepts.
NodeSchool.io is a series of free + open source interactive workshops that teach you the principles of Node.js and beyond.
Learn You The Node.js is the introductory NodeSchool.io workshop. It’s a set of programming problems that introduce you to common node patterns. It comes packaged as a command line program.
You can install it with npm:
# install
npm install learnyounode -g
# start the menu
learnyounode
Node.js is an open source project designed to help you write JavaScript programs that talk to networks, file systems or other I/O (input/output, reading/writing) sources. That’s it! It is just a simple and stable I/O platform that you are encouraged to build modules on top of.
What are some examples of I/O? Here is a diagram of an application that I made with node that shows many I/O sources:
If you don’t understand all of the different things in the diagram it is completely okay. The point is to show that a single node process (the hexagon in the middle) can act as the broker between all of the different I/O endpoints (orange and purple represent I/O).
Usually building these kinds of systems is either:
- difficult to code but yields super fast results (like writing your web servers from scratch in C), or
- easy to code but not very speedy/robust (like when someone tries to upload a 5GB file and your server crashes)
Node’s goal is to strike a balance between these two: relatively easy to understand and use and fast enough for most use cases.
Node isn’t either of the following:
- a web framework (like Rails or Django, though it can be used to make such things)
- a programming language (it uses JavaScript but node isn’t its own language)

Instead, node is somewhere in the middle. It is:
- designed to be simple and therefore relatively easy to understand and use
- useful for I/O based programs that need to be fast and/or handle lots of connections

At a lower level, node can be described as a tool for writing two major types of programs:
- network programs using the protocols of the web: HTTP, TCP, UDP, DNS and SSL
- programs that read and write data to the filesystem or local processes/memory

What is an “I/O based program”? Here are some common I/O sources:
- databases (e.g. MySQL, PostgreSQL, MongoDB, Redis, CouchDB)
- APIs (e.g. Twitter, Facebook, Apple Push Notifications)
- HTTP/WebSocket connections (from users of a web app)
- files (image resizer, video editor, internet radio)
Node does I/O in a way that is asynchronous which lets it handle lots of different things simultaneously. For example, if you go down to a fast food joint and order a cheeseburger they will immediately take your order and then make you wait around until the cheeseburger is ready. In the meantime they can take other orders and start cooking cheeseburgers for other people. Imagine if you had to wait at the register for your cheeseburger, blocking all other people in line from ordering while they cooked your burger! This is called blocking I/O because all I/O (cooking cheeseburgers) happens one at a time. Node, on the other hand, is non-blocking, which means it can cook many cheeseburgers at once.
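In code, the difference looks something like this (a minimal sketch, assuming a file called number.txt exists):

var fs = require('fs')

// blocking: node waits at this line until the entire file has been read
var contents = fs.readFileSync('number.txt')
console.log('blocking read done')

// non-blocking: the read is kicked off and node moves on immediately
fs.readFile('number.txt', function (err, data) {
  console.log('non-blocking read done')
})
console.log('this logs before the non-blocking read finishes')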
Here are some fun things made easy with node thanks to its non-blocking nature:
Firstly I would recommend that you get node installed on your computer. The easiest way is to visit nodejs.org and click Install.
Node has a small core group of modules (commonly referred to as ‘node core’) that are presented as the public API that you are intended to write programs with. For working with file systems there is the fs module, and for networks there are modules like net (TCP), http, and dgram (UDP).
In addition to fs and network modules there are a number of other base modules in node core. There is a module for asynchronously resolving DNS queries called dns, a module for getting OS-specific information like the tmpdir location called os, a module for allocating binary chunks of memory called buffer, some modules for parsing urls and paths (url, querystring, path), etc. Most if not all of the modules in node core are there to support node’s main use case: writing fast programs that talk to file systems or networks.
Node handles I/O with: callbacks, events, streams and modules. If you learn how these four things work then you will be able to go into any module in node core and have a basic understanding about how to interface with it.
This is the most important topic to understand if you want to understand how to use node. Nearly everything in node uses callbacks. They weren’t invented by node, they are just part of the JavaScript language.
Callbacks are functions that are executed asynchronously, or at a later time. Instead of the code reading top to bottom procedurally, async programs may execute different functions at different times based on the order and speed that earlier functions like http requests or file system reads happen.
The difference can be confusing since determining if a function is asynchronous or not depends a lot on context. Here is a simple synchronous example, meaning you can read the code top to bottom just like a book:
var myNumber = 1
function addOne() { myNumber++ } // define the function
addOne() // run the function
console.log(myNumber) // logs out 2
The code here defines a function and then on the next line calls that function, without waiting for anything. When the function is called it immediately adds 1 to the number, so we can expect that after we call the function the number should be 2. This is the expectation of synchronous code - it sequentially runs top to bottom.
Node, however, uses mostly asynchronous code. Let’s use node to read our number from a file called number.txt:
var fs = require('fs') // require is a special function provided by node
var myNumber = undefined // we don't know what the number is yet since it is stored in a file
function addOne() {
fs.readFile('number.txt', function doneReading(err, fileContents) {
myNumber = parseInt(fileContents)
myNumber++
})
}
addOne()
console.log(myNumber) // logs out undefined -- this line gets run before readFile is done
Why do we get undefined when we log out the number this time? In this code we use the fs.readFile method, which happens to be an asynchronous method. Usually things that have to talk to hard drives or networks will be asynchronous. If they just have to access things in memory or do some work on the CPU they will be synchronous. The reason for this is that I/O is reallyyy reallyyy sloowwww. A ballpark figure would be that talking to a hard drive is about 100,000 times slower than talking to memory (e.g. RAM).
When we run this program all of the functions are immediately defined, but they don’t all execute immediately. This is a fundamental thing to understand about async programming. When addOne is called it kicks off a readFile and then moves on to the next thing that is ready to execute. If there is nothing to execute node will either wait for pending fs/network operations to finish or it will stop running and exit to the command line.
When readFile is done reading the file (this may take anywhere from milliseconds to seconds to minutes depending on how fast the hard drive is) it will run the doneReading function and give it an error (if there was an error) and the file contents.
The reason we got undefined above is that nowhere in our code exists logic that tells the console.log statement to wait until the readFile statement finishes before it prints out the number.
If you have some code that you want to be able to execute over and over again, or at a later time, the first step is to put that code inside a function. Then you can call the function whenever you want to run your code. It helps to give your functions descriptive names.
Callbacks are just functions that get executed at some later time. The key to understanding callbacks is to realize that they are used when you don’t know when some async operation will complete, but you do know where the operation will complete — the last line of the async function! The top-to-bottom order that you declare callbacks does not necessarily matter, only the logical/hierarchical nesting of them. First you split your code up into functions, and then use callbacks to declare if one function depends on another function finishing.
The fs.readFile method is provided by node, is asynchronous, and happens to take a long time to finish. Consider what it does: it has to go to the operating system, which in turn has to go to the file system, which lives on a hard drive that may or may not be spinning at thousands of revolutions per minute. Then it has to use a magnetic head to read data and send it back up through the layers back into your javascript program. You give readFile a function (known as a callback) that it will call after it has retrieved the data from the file system. It puts the data it retrieved into a javascript variable and calls your function (callback) with that variable. In this case the variable is called fileContents because it contains the contents of the file that was read.
Think of the restaurant example at the beginning of this tutorial. At many restaurants you get a number to put on your table while you wait for your food. These are a lot like callbacks. They tell the server what to do after your cheeseburger is done.
Let’s put our console.log statement into a function and pass it in as a callback:
var fs = require('fs')
var myNumber = undefined
function addOne(callback) {
fs.readFile('number.txt', function doneReading(err, fileContents) {
myNumber = parseInt(fileContents)
myNumber++
callback()
})
}
function logMyNumber() {
console.log(myNumber)
}
addOne(logMyNumber)
Now the logMyNumber function can get passed in as an argument that will become the callback variable inside the addOne function. After readFile is done the callback variable will be invoked (callback()). Only functions can be invoked, so if you pass in anything other than a function it will cause an error.
When a function gets invoked in javascript the code inside that function will immediately get executed. In this case our log statement will execute since callback is actually logMyNumber. Remember, just because you define a function it doesn’t mean it will execute. You have to invoke a function for that to happen.
To break down this example even more, here is a timeline of events that happen when we run this program:

1. fs and myNumber are declared as variables, while addOne and logMyNumber are declared as functions. Note that these are just declarations. Neither function has been called nor invoked yet.
2. addOne is invoked with the logMyNumber function passed as its callback argument. Invoking addOne will first run the asynchronous fs.readFile function. This part of the program takes a while to finish.
3. Node waits for readFile to finish. If there was anything else to do during this time, node would be available for work.
4. When readFile finishes it executes its callback, doneReading, which parses fileContents for an integer called myNumber, increments myNumber and then immediately invokes the function that addOne passed in (its callback), logMyNumber.

Perhaps the most confusing part of programming with callbacks is how functions are just objects that can be stored in variables and passed around with different names. Giving simple and descriptive names to your variables is important in making your code readable by others. Generally speaking in node programs when you see a variable like callback or cb you can assume it is a function.
You may have heard the terms ‘evented programming’ or ‘event loop’. They refer to the way that readFile is implemented. Node first dispatches the readFile operation and then waits for readFile to send it an event that it has completed. While it is waiting node can go check on other things. Inside node there is a list of things that are dispatched but haven’t reported back yet, so node loops over the list again and again checking to see if they are finished. After they finish they get ‘processed’, e.g. any callbacks that depended on them finishing will get invoked.
Here is a pseudocode version of the above example:
function addOne(thenRunThisFunction) {
waitAMinuteAsync(function waitedAMinute() {
thenRunThisFunction()
})
}
addOne(function thisGetsRunAfterAddOneFinishes() {})
Imagine you had 3 async functions a, b and c. Each one takes 1 minute to run and after it finishes it calls a callback (that gets passed in the first argument). If you wanted to tell node ‘start running a, then run b after a finishes, and then run c after b finishes’ it would look like this:
a(function() {
b(function() {
c()
})
})
When this code gets executed, a will immediately start running, then a minute later it will finish and call b, then a minute later it will finish and call c, and finally 3 minutes later node will stop running since there would be nothing more to do. There are definitely more elegant ways to write the above example (one is sketched below), but the point is that if you have code that has to wait for some other async code to finish then you express that dependency by putting your code in functions that get passed around as callbacks.
The design of node requires you to think non-linearly. Consider this list of operations:
read a file
process that file
If you were to turn this into pseudocode you would end up with this:
var file = readFile()
processFile(file)
This kind of linear (step-by-step, in order) code isn’t the way that node works. If this code were to get executed then readFile and processFile would both get executed at the same exact time. This doesn’t make sense since readFile will take a while to complete. Instead you need to express that processFile depends on readFile finishing. This is exactly what callbacks are for! And because of the way that JavaScript works you can write this dependency many different ways:
var fs = require('fs')
fs.readFile('movie.mp4', finishedReading)
function finishedReading(error, movieData) {
if (error) return console.error(error)
// do something with the movieData
}
But you could also structure your code like this and it would still work:
var fs = require('fs')
function finishedReading(error, movieData) {
if (error) return console.error(error)
// do something with the movieData
}
fs.readFile('movie.mp4', finishedReading)
Or even like this:
var fs = require('fs')
fs.readFile('movie.mp4', function finishedReading(error, movieData) {
if (error) return console.error(error)
// do something with the movieData
})
In node if you require the events module you can use the so-called ‘event emitter’ that node itself uses for all of its APIs that emit things.
Events are a common pattern in programming, known more widely as the ‘observer pattern’ or ‘pub/sub’ (publish/subscribe). Whereas callbacks are a one-to-one relationship between the thing waiting for the callback and the thing calling the callback, events are the same exact pattern except with a many-to-many API.
The easiest way to think about events is that they let you subscribe to things. You can say ‘when X do Y’, whereas with plain callbacks it is ‘do X then Y’.
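To make that concrete, here is a minimal sketch using node’s events module (the 'jump' event name is just an example):

var EventEmitter = require('events').EventEmitter
var game = new EventEmitter()

// subscribe: 'when jump happens, do this'
game.on('jump', function () {
  console.log('player jumped!')
})

// publish: every listener subscribed to 'jump' gets invoked
game.emit('jump')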
Here are a few common use cases for using events instead of plain callbacks:
- a game engine where you want to let game developers subscribe to events like .on('jump', function() {})
- a low-level web server that wants to expose an API to easily hook into events like .on('incomingRequest') or .on('serverError')
If we were trying to write a module that connects to a chat server using only callbacks it would look like this:
var chatClient = require('my-chat-client')
function onConnect() {
// have the UI show we are connected
}
function onConnectionError(error) {
// show error to the user
}
function onDisconnect() {
// tell user that they have been disconnected
}
function onMessage(message) {
// show the chat room message in the UI
}
chatClient.connect(
'http://mychatserver.com',
onConnect,
onConnectionError,
onDisconnect,
onMessage
)
As you can see this is really cumbersome because of all of the functions that you have to pass in a specific order to the .connect function. Writing this with events would look like this:
var chatClient = require('my-chat-client').connect()
chatClient.on('connect', function() {
// have the UI show we are connected
})
chatClient.on('connectionError', function() {
// show error to the user
})
chatClient.on('disconnect', function() {
// tell user that they have been disconnected
})
chatClient.on('message', function() {
// show the chat room message in the UI
})
This approach is similar to the pure-callback approach but introduces the .on method, which subscribes a callback to an event. This means you can choose which events you want to subscribe to from the chatClient. You can also subscribe to the same event multiple times with different callbacks:
var chatClient = require('my-chat-client').connect()
chatClient.on('message', logMessage)
chatClient.on('message', storeMessage)
function logMessage(message) {
console.log(message)
}
function storeMessage(message) {
myDatabase.save(message)
}
Early on in the node project the file system and network APIs had their own separate patterns for dealing with streaming I/O. For example, files in a file system have things called ‘file descriptors’ so the fs module had to have extra logic to keep track of these things whereas the network modules didn’t have such a concept. Despite minor differences in semantics like these, at a fundamental level both groups of code were duplicating a lot of functionality when it came to reading data in and out. The team working on node realized that it would be confusing to have to learn two sets of semantics to essentially do the same thing, so they made a new API called the Stream and made all the network and file system code use it.
The whole point of node is to make it easy to deal with file systems and networks so it made sense to have one pattern that was used everywhere. The good news is that most of the patterns like these (there are only a few anyway) have been figured out at this point and it is very unlikely that node will change that much in the future.
There are already two great resources that you can use to learn about node streams. One is the stream-adventure (see the Learn Node Interactively section) and the other is a reference called the Stream Handbook.
stream-handbook is a guide, similar to this one, that contains a reference for everything you could possibly need to know about streams.
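Until you dig into those, here is the flavor of the pattern (a minimal sketch, assuming a local file movie.mp4 exists):

var fs = require('fs')

// streams hand you data chunk by chunk instead of all at once
var source = fs.createReadStream('movie.mp4')
var destination = fs.createWriteStream('copy-of-movie.mp4')

// pipe wires a readable stream into a writable one and manages the flow
source.pipe(destination)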
Node core is made up of about two dozen modules, some lower level ones like events and stream, some higher level ones like http and crypto.
This design is intentional. Node core is supposed to be small, and the modules in core should be focused on providing tools for working with common I/O protocols and formats in a way that is cross-platform.
For everything else there is npm. Anyone can create a new node module that adds some functionality and publish it to npm. As of the time of this writing there are 34,000 modules on npm.
Imagine you are trying to convert PDF files into TXT files. The best place to start is by doing npm search pdf:
There are a ton of results! npm is quite popular and you will usually be able to find multiple potential solutions. If you go through each module and whittle down the results into a more narrow set (filtering out things like PDF generation modules) you’ll end up with these:
A lot of the modules have overlapping functionality but present alternate APIs, and most of them require external dependencies (like apt-get install poppler).
Here are some different ways to interpret the modules:
- pdf2json is the only one that is written in pure JavaScript, which means it is the easiest to install, especially on low power devices like the raspberry pi or on Windows where native code might not be cross platform.
- mimeograph, hummus and pdf-extract each combine multiple lower level modules to expose a high level API
- some modules are wrappers around the pdftotext/poppler unix command line tools

Let’s compare the differences between pdftotextjs and pdf-text-extract, both of which are wrappers around the pdftotext utility.
Both of these:
Just looking at the package.json + module statistics it’s hard to get a feeling about which one might be the right choice. Let’s compare the READMEs:
Both have simple descriptions, CI badges, installation instructions, clear examples and instructions for running the tests. Great! But which one do we use? Let’s compare the code:
pdftotextjs is around 110 lines of code, and pdf-text-extract is around 40, but both essentially boil down to this line:
var child = shell.exec('pdftotext ' + self.options.additional.join(' '));
Does this make one any better than the other? Hard to say! It’s important to actually read the code and make your own conclusions. If you find a module you like, use npm star modulename to give npm feedback about modules that you had a positive experience with.
npm is different from most package managers in that it installs modules into a folder inside of other existing modules. The previous sentence might not make sense right now but it is the key to npm’s success.
Many package managers install things globally. For instance, if you apt-get install couchdb on Debian Linux it will try to install the latest stable version of CouchDB. If you are trying to install CouchDB as a dependency of some other piece of software and that software needs an older version of CouchDB, you have to uninstall the newer version of CouchDB and then install the older version. You can’t have two versions of CouchDB installed because Debian only knows how to install things into one place.
It’s not just Debian that does this. Most programming language package managers work this way too. To address the global dependency problem described above, virtual environments have been developed, like virtualenv for Python or bundler for Ruby. These split your environment up into many virtual environments, one for each project, but inside each environment dependencies are still globally installed. Virtual environments don’t always solve the problem; sometimes they just multiply it by adding additional layers of complexity.
With npm, installing global modules is an anti-pattern. Just like how you shouldn’t use global variables in your JavaScript programs, you also shouldn’t install global modules (unless you need a module with an executable binary to show up in your global PATH, but you don’t always need to do this – more on this later).
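In practice the rule of thumb looks like this (request here is just an example dependency; learnyounode from earlier ships a command-line executable, so a global install is legitimate):

# local install: goes into ./node_modules, available only to this project
npm install request

# global install: reserved for modules whose executables belong on your PATH
npm install -g learnyounode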
How require works

When you call require('some_module') in node, here is what happens:

- if a file called some_module.js exists in the current folder node will load that, otherwise:
- node looks in the current folder for a node_modules folder with a some_module folder in it
- if it doesn’t find it there, it goes up one folder and repeats

This cycle repeats until node reaches the root folder of the filesystem, at which point it will then check any global module folders (e.g. /usr/local/node_modules on Mac OS) and if it still doesn’t find some_module it will throw an exception.
Here’s a visual example:
When the current working directory is subsubfolder and require('foo') is called, node will look for the folder called subsubfolder/node_modules. In this case it won’t find it – the folder there is mistakenly called my_modules. Then node will go up one folder and try again, meaning it then looks for subfolder_B/node_modules, which also doesn’t exist. Third try is a charm, though, as folder/node_modules does exist and has a folder called foo inside of it. If foo wasn’t in there node would continue its search up the directory tree.
Note that if called from subfolder_B node will never find subfolder_A/node_modules, it can only see folder/node_modules on its way up the tree.
One of the benefits of npm’s approach is that modules can install their dependent modules at specific known working versions. In this case the module foo is quite popular - there are three copies of it, each one inside its parent module folder. The reasoning for this could be that each parent module needed a different version of foo, e.g. ‘folder’ needs foo@0.0.1, subfolder_A needs foo@0.2.1, etc.
Here’s what happens when we fix the folder naming error by changing my_modules to the correct name node_modules:
To test out which module actually gets loaded by node, you can use the require.resolve('some_module') command, which will show you the path to the module that node finds as a result of the tree-climbing process. require.resolve can be useful when double-checking that the module that you think is getting loaded is actually getting loaded – sometimes there is another version of the same module closer to your current working directory than the one you intend to load.
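For example, assuming the folder layout above:

// prints the full path of the module that require('foo') would load,
// without actually running it
console.log(require.resolve('foo'))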
Now that you know how to find modules and require them you can start writing your own modules.
Node modules are radically lightweight. Here is one of the simplest possible node modules:
package.json:
{
"name": "number-one",
"version": "1.0.0"
}
index.js:
module.exports = 1
By default node tries to load module/index.js when you require('module'); any other file name won’t work unless you set the main field of package.json to point to it.
Put both of those files in a folder called number-one (the name in package.json must match the folder name) and you’ll have a working node module.
Calling the function require('number-one') returns the value of whatever module.exports is set to inside the module:
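For instance:

var one = require('number-one')
console.log(one) // logs out 1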
An even quicker way to create a module is to run these commands:
mkdir my_module
cd my_module
git init
git remote add origin git@github.com:yourusername/my_module.git
npm init
Running npm init will create a valid package.json for you, and if you run it in an existing git repo it will set the repository field inside package.json automatically as well!
A module can list any other modules from npm or GitHub in the dependencies field of package.json. To install the request module as a new dependency and automatically add it to package.json, run this from your module root directory:
npm install --save request
This installs a copy of request into the closest node_modules folder and makes our package.json look something like this:
{
"id": "number-one",
"version": "1.0.0",
"dependencies": {
"request": "~2.22.0"
}
}
By default npm install will grab the latest published version of a module.
A common misconception about npm is that since it has ‘Node’ in the name it must only be used for server side JS modules. This is completely untrue! npm actually stands for Node Packaged Modules, e.g. modules that Node packages together for you. The modules themselves can be whatever you want – they are just a folder of files wrapped up in a .tar.gz, and a file called package.json that declares the module version and a list of all modules that are dependencies of the module (as well as their version numbers so the working versions get installed automatically). It’s turtles all the way down - module dependencies are just modules, and those modules can have dependencies etc. etc. etc.
browserify is a utility written in Node that tries to convert any node module into code that can be run in browsers. Not all modules work (browsers can’t do things like host an HTTP server), but a lot of modules on NPM will work.
To try out npm in the browser you can use RequireBin, an app I made that takes advantage of Browserify-CDN, which internally uses browserify but returns the output through HTTP (instead of the command line – which is how browserify is usually used).
Try putting this code into RequireBin and then hit the preview button:
var reverse = require('ascii-art-reverse')
// makes a visible HTML console
require('console-log').show(true)
var coolbear =
" ('-^-/') \n" +
" `o__o' ] \n" +
" (_Y_) _/ \n" +
" _..`--'-.`, \n" +
" (__)_,--(__) \n" +
" 7: ; 1 \n" +
" _/,`-.-' : \n" +
" (_,)-~~(_,) \n"
setInterval(function() { console.log(coolbear) }, 1000)
setTimeout(function() {
setInterval(function() { console.log(reverse(coolbear)) }, 1000)
}, 500)
Or check out a more complicated example (feel free to change the code and see what happens):
Like any good tool, node is best suited for a certain set of use cases. For example: Rails, the popular web framework, is great for modeling complex business logic, e.g. using code to represent real life business objects like accounts, loans, itineraries, and inventories. While it is technically possible to do the same type of thing using node, there would be definite drawbacks since node is designed for solving I/O problems and it doesn’t know much about ‘business logic’. Each tool focuses on different problems. Hopefully this guide will help you gain an intuitive understanding of the strengths of node so that you know when it can be useful to you.
Fundamentally node is just a tool used for managing I/O across file systems and networks, and it leaves other more fancy functionality up to third party modules. Here are some things that are outside the scope of node:
There are a number of web frameworks built on top of node (framework meaning a bundle of solutions that attempts to address some high level problem like modeling business logic), but node is not a web framework. Web frameworks that are written using node don’t always make the same kind of decisions about adding complexity, abstractions and tradeoffs that node does and may have other priorities.
Node uses JavaScript and doesn’t change anything about it. Felix Geisendörfer has a pretty good write-up of the ‘node style’ here.
When possible node will use the simplest possible way of accomplishing something. The ‘fancier’ you make your JavaScript the more complexity and tradeoffs you introduce. Programming is hard, especially in JS where there are 1000 solutions to every problem! It is for this reason that node tries to always pick the simplest, most universal option. If you are solving a problem that calls for a complex solution and you are unsatisfied with the ‘vanilla JS solutions’ that node implements, you are free to solve it inside your app or module using whichever abstractions you prefer.
A great example of this is node’s use of callbacks. Early on node experimented with a feature called ‘promises’ that added a number of features to make async code appear more linear. It was taken out of node core for a few reasons:
Consider one of the most universal and basic things that node does: reading a file. When you read a file you want to know when errors happen, like when your hard drive dies in the middle of your read. If node had promises everyone would have to branch their code like this:
fs.readFile('movie.mp4')
.then(function(data) {
// do stuff with data
})
.error(function(error) {
// handle error
})
This adds complexity, and not everyone wants that. Instead of two separate functions node just uses a single callback function. Here are the rules:
Hence, the node callback style:
fs.readFile('movie.mp4', function(err, data) {
// handle error, do stuff with data
})
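The convention, spelled out: the callback goes last, its first argument is reserved for an error, and you check that error before touching the data. A minimal sketch:

var fs = require('fs')

fs.readFile('movie.mp4', function (err, data) {
  // err is an Error object if something went wrong, otherwise null
  if (err) return console.error(err)
  // by this point err was null, so data is safe to use
  console.log('read', data.length, 'bytes')
})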
Note: If you don’t know what these things mean then you will likely have an easier time learning node, since unlearning things is just as much work as learning things.
Node uses threads internally to make things fast but doesn’t expose them to the user. If you are a technical user wondering why node is designed this way then you should 100% read about the design of libuv, the C++ I/O layer that node is built on top of.
Creative Commons Attribution License (do whatever, just attribute me)
http://creativecommons.org/licenses/by/2.0/
Donate icon is from the Noun Project
Three-part true crime documentary that’s two episodes longer than it needed to be. The psychological evaluations were of interest, particularly from a “clinical hypnotherapist with training in crystal healing and past life therapy”.
You know… the life that you had before this one? That.
I cannot help rewatching this powerful scene from “Margin Call”. A masterclass in acting by the great Jeremy Irons.
Every sentence, glance, and gesture projects complete and menacing presence, power, and finality, and is done to absolute perfection 👌
Certainly looks like it, Mr. Hlas. Go Hawks 🤘
I’m not saying there’s going to be a schism or anything, but I’m not not saying that, either.
This is just genius. (Cached)
This is the city of Madras
The home of the curry and the dal
Where Iyers speak only to Iyengars
And Iyengars speak only to God.
I’d read this years ago some place and forgot where. Thought it would be in some Religious Studies textbook back from when I was (briefly) a Religious Studies major. Nope! It was the great Paul Erdős!
“Erdős said he’d modelled it after this ditty about the privileged New England families famously known as the ‘Boston Brahmins’.”
This is good old Boston
The home of the bean and the cod
Where the Lowells speak to the Cabots
And the Cabots speak only to God.
Amdahl’s Law
The speedup gained from running a program on a parallel computer is greatly limited by the fraction of that program that can’t be parallelized.
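In formula form (with P the fraction of the program that can be parallelized and N the number of processors): Speedup = 1 / ((1 − P) + P/N).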
Augustine’s Second Law of Socioscience
For every scientific (or engineering) action, there is an equal and opposite social reaction.
Brooks’s Law
Adding manpower to a late software project makes it later.
Clarke’s First Law
When a distinguished but elderly scientist states that something is possible he is almost certainly right. When he states that something is impossible, he is very probably wrong.
Clarke’s Second Law
The only way of discovering the limits of the possible is to venture a little way past them into the impossible.
Clarke’s Third Law
Any sufficiently advanced technology is indistinguishable from magic.
Conway’s Law
Any piece of software reflects the organizational structure that produced it.
Cope’s Rule
There is a general tendency toward size increase in evolution.
The Dilbert Principle
The most ineffective workers are systematically moved to the place where they can do the least damage: management.
Ellison’s Law of Cryptography and Usability
The userbase for strong cryptography declines by half with every additional keystroke or mouseclick required to make it work.
Ellison’s Law of Data
Once the business data have been centralized and integrated, the value of the database is greater than the sum of the preexisting parts.
As the rate of erroneous alerts increases, operator reliance, or belief, in subsequent warnings decreases.
The more highly adapted an organism becomes, the less adaptable it is to any new change.
Fitts’s Law
The time to acquire a target is a function of the distance to and the size of the target.
Flon’s Law
There does not now, nor will there ever, exist a programming language in which it is the least bit hard to write bad programs.
Gilder’s Law
Bandwidth grows at least three times faster than computer power.
Godwin’s Law
As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one.
Grosch’s Law
The cost of computing systems increases as the square root of the computational power of the systems.
Hartree’s Law
Whatever the state of a project, the time a project-leader will estimate for completion is constant.
Heisenbug Uncertainty Principle
Most production software bugs are soft: they go away when you look at them.
Hick’s Law
The time it takes a person to make a decision is a function of the number of possible choices he or she has.
Hoare’s Law of Large Problems
Inside every large problem is a small problem struggling to get out.
Hofstadter’s Law
A task always takes longer than you expect, even when you take into account Hofstadter’s Law.
Jakob’s Law of the Internet User Experience
Users spend most of their time on other sites. This means that users prefer your site to work the same way as all the other sites they already know.
Joy’s Law
smart(employees) = log(employees), or “No matter who you are, most of the smartest people work for someone else.”
Kerckhoffs’s Principle
In cryptography, a system should be secure even if everything about the system, except for a small piece of information — the key — is public knowledge.
Linus’s Law
Given enough eyeballs, all bugs are shallow.
Lister’s Law
People under time pressure don’t think faster.
Metcalfe’s Law
In network theory, the value of a system grows as approximately the square of the number of users of the system.
Moore’s Law
The number of transistors on an integrated circuit will double in about 18 months.
Murphy’s Law
If there are two or more ways to do something, and one of those ways can result in a catastrophe, then someone will do it.
Nathan’s First Law
Software is a gas; it expands to fill its container.
Ninety-ninety Rule
The first 90% of the code accounts for the first 90% of the development time. The remaining 10% of the code accounts for the other 90% of the development time.
Occam’s Razor
The explanation requiring the fewest assumptions is most likely to be correct.
Osborn’s Law
Variables won’t; constants aren’t.
Postel’s Law (the second clause of the Robustness Principle)
Be conservative in what you send, liberal in what you accept.
Pareto Principle (a.k.a. “The 80-20 Rule”)
For many phenomena, 80% of consequences stem from 20% of the causes.
Parkinson’s Law
Work expands so as to fill the time available for its completion.
Pesticide Paradox
Every method you use to prevent or find bugs leaves a residue of subtler bugs against which those methods are ineffectual.
The Peter Principle
In a hierarchy, every employee tends to rise to his level of incompetence.
Reed’s Law
The utility of large networks, particularly social networks, scales exponentially with the size of the network.
Rock’s Law
The cost of a semiconductor chip fabrication plant doubles every four years.
Sixty-sixty Rule
Sixty percent of software’s dollar is spent on maintenance, and sixty percent of that maintenance is enhancement.
The time it takes your favorite application to complete a given task doubles with each new revision.
For just about any technology, be it an operating system, application or network, when a sufficient level of adoption is reached, that technology then becomes a threat vector.
Sturgeon’s Law
Ninety percent of everything is crud.
Tesler’s Law of Conservation as Complexity
You cannot reduce the complexity of a given task beyond a certain point. Once you’ve reached that point, you can only shift the burden around.
Weibull’s Power Law
The logarithm of failure rates increases linearly with the logarithm of age.
Wirth’s Law
Software gets slower faster than hardware gets faster.
Zawinski’s Law
Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can.
The Chess Sets used at the World Chess Championships cost $350 (plus $700 if you want the electronic piece tracking), are likely out of stock if you’d like one, take a lot of training and practice to make, are woodworked in Amritsar, India, and were designed by Daniel Weil, a former partner at Pentagram.
About half the set’s value lies with the most difficult piece to make: the Knight ♘
Here’s Design Week on their conception. More on this Business Insider video.
I had no idea that he performed at the Hancher Auditorium at The University of Iowa in 1997 and mortified most of a crowd of 1,200 excited kids and their parents who’d come to see him, Darrell Hammond, and Jim Breuer at the height of their SNL fame. The 250 or so who’d remained appear to have had a fantastic time:
Here’s The Des Moines Register’s report:
Here’s Jim Breuer on the incident:
Danny Pudi keeping it real.
“Uh… a luxury you can’t live without.”
“A luxury I can’t live without… Coffee. I really like it.”
“Luxury… you can get it anywhere.”
“Ah I guess, yeah. Like good coffee…”
“I love coffee too.”
“I like nice socks.”
“Socks. Your socks you would put in your shoes.”
“Yeah. I really love them. I like kind of like you know, cozy feet.”
“You’re attracted to your socks.”
“I’m attracted to really nice running socks. Like I’m always looking for good running…”
“That’s not a luxury, though. Coffee and socks are not a luxury at all.”
“Alright give me a luxury. What luxury should I have?”
“Private plane.”
“Larry. I’m on Duck Tales.”
Saw with NN. At least twice as long as it needs to be. Didn’t care about the score. It’s three hours of Vijay doing Vijay things with gusto. Spoiler: I understand that mass Indian entertainers, particularly the South Indian ones, have a tenuous relationship with reality. But we are to be OK with two siblings, born five years apart, looking like facsimiles of each other. They didn’t even bother shaving the mustache of the younger bro. Come the fuck on.
Saw with BE and NN. Eh. Clear messages about creatives’ struggles and temptations, and the importance of continuing to tell past and present stories of horrific pain and suffering.
I suppose I just lazily wanted to watch a well-made scary movie without actively engaging with it, without searching for the clever and occasionally deep symbolism that has come to characterize a movie with Jordan Peele’s name on it. It was adequately scary.
Speaking of these “twin melodies”: I haven’t seen the 1992 original and it’s on my list. Didn’t know that Philip Glass did the score for the movie.
In many cases, you’re staring at the face of someone who lived centuries ago. That was their hair, their nose, their eye-lashes, their sleep. Very few things are more fascinating than this.
The Borremose Man died in the 7th century BCE. He was killed by a blow to the back of his head and had a rope with a slip knot tied around his neck. It is believed that he was a human sacrifice. He was found in the Borremose peat bog in Himmerland, Denmark in 1946. Shortly after, two other, less well preserved, bodies were discovered in the same marsh. Credit: Danish National Museum/Wikimedia Commons
The face of the Tollund Man. Credit: Sven Rosborn/Wikimedia Commons
The Yde Girl died sometime between 54 BCE and 128 CE at an approximate age of 16 years old. She suffered from scoliosis and had long reddish blonde hair that was preserved by the swamp. She was buried with a ritually tied woolen braid around her neck, suggesting she was killed as a human sacrifice. However, due to damage to the body at the time of discovery, the cause of her death is unknown. She was found outside the village of Yde, Netherlands. Credit: Drents Museum/Wikimedia Commons
The Grauballe Man died during the late 3rd century BCE when he was around thirty years old. He was found naked, with no indication of any clothing around him. His neck was slit from ear-to-ear in a bog in Jutland, Denmark in 1955. His well-preserved hair was likely dark brown during his life but was turned red by the bog. Historians believe he was likely a human sacrifice. Credit: Sven Rosborn/Wikimedia Commons
The Tollund Man was an approximately 40-year-old man who was killed sometime between 375 and 210 BCE. He was found with a noose around his neck, indicating he was hanged to death, as well as a sheepskin cap on his head. He was found in a bog outside of the Danish town of Silkeborg in 1950. Credit: Wikimedia Commons
The Damendorf Man died around 300 BCE and had his body squashed flat by the weight of the peat that accumulated on top of him. He was found in a bog outside the German town of Damendorf in 1900 with a leather belt, shoes, and a pair of breeches. Credit: Bullenwächter/Wikimedia Commons
The Bocksten Man likely lived sometime between 1290 and 1430. He was a tall, slender man, most likely in his 40s at the time of his death. He was killed and then pinned with two wooden poles, one of which went directly through his heart, to the bed of a lake that would later become a bog. This impaling likely happened after his death, as he also had a large wound on his head. He was found in a bog near Varberg Municipality, Sweden in 1936. His hair was found perfectly preserved, and he was also discovered with a hooded garment and an engraved leather sheath. Credit: Peter Lindberg/Wikimedia Commons
The Arden Woman lived during the 14th century BCE and was around 20–25 years old at the time of her death. She was found in the Bredmose bog in Hindsted, Denmark in 1942; police said the corpse was found in a ‘question mark’ shape. Her well-preserved hair was dark blond, drawn into two pigtails, and coiled around the top of her head. Unlike some bog bodies, she was found with garments and with no evidence of a violent death. Credit: P.V. Glob/Wikimedia Commons
The full body of The Grauballe Man. His hands were so well preserved that researchers were able to take the fingerprints of the over 2,000-year-old body. Credit: Colin/Wikimedia Commons
The Clonycavan Man was an Irish man who died sometime between 392 BCE and 201 BCE. He was 5’2, with a squashed nose, crooked teeth, and gelled-up hair. He was killed by an ax blow to the back of his head. The Clonycavan Man was discovered in 2003 in Clonycavan, Ireland when he was picked up by a modern peat harvesting machine that mangled his lower body. His rich diet, imported hair gel, and death near a hill used for kingly initiation led historians to theorize that he was a king who was ritually sacrificed after a bad harvest. Credit: Mark Healey/Wikimedia Commons
The Kreepen Man was a body discovered in a bog in 1903 near Verden, Germany. The body had twisted oak and willow branches binding his hands and feet. After its discovery, the body was sold to The Museum of European Cultures in Berlin but was destroyed when the city was bombed during WWII. Hair found at the site, believed to belong to the Kreepen Man, dates to between 1440 and 1520, but without the body, the genuine date of death is unknown. Credit: Andreas Franzkowiak/Wikimedia Commons
The Huldremose Woman died sometime between 160 BCE and 340 CE and was over 40 years old at the time of her death. She had a rope around her neck, indicating she may have been strangled or hanged to death, and there is also a laceration on one of her feet. She was discovered by a schoolteacher in 1879 in a peat bog near Ramten, Denmark, wearing an elaborate wool plaid cape, scarf, and skirt. Credit: Kira Ursem/Wikimedia Commons
The Weerdinge Men are two naked bog bodies found in Drenthe, the Netherlands in 1904. They would have lived sometime between 60 BCE and 220 CE. One of the men had a large cut in his abdomen, through which his intestines spilled out, which some historians believe indicates that he was cut open so an ancient druid could divine the future from his entrails. Credit: Wikimedia Commons
The Röst Girl is thought to have died sometime between 200 BCE and 80 CE in a bog in the Schleswig-Holstein state of Germany. She was discovered in 1926, but the cause of her death is unknown because her body was destroyed during WWII. Credit: Wikimedia Commons
The Old Croghan Man lived sometime between 362 BCE and 175 BCE and would have been around 20 years old at the time of his death. This torso, missing the head and lower body, was discovered in 2003 in a bog near Croghan Hill in Ireland. From his arm-span, it is believed he would have been 6’6. Credit: Mark Healey/Wikimedia Commons
Roter Franz died in the Bourtanger Moor, on what is now the border of Germany and the Netherlands, sometime between 220 and 430 CE during the Roman Iron Age. The name Roter Franz (meaning Red Franz in English) is derived from the red hair and beard discovered on the body. He was killed when his throat was slit and had an arrow wound on his shoulder. Credit: Axel Hindemith/Wikimedia Commons
The Osterby Head was discovered in 1948 in a bog to the southeast of Osterby, Germany. The man whose head this belonged to lived sometime between 75 and 130 CE and was 50 to 60 years of age when he died. Evidence shows that he was fatally struck in the head and then beheaded. His hair was tied in a Suebian knot, indicating he was likely a free man of the Germanic Suebi tribe. Credit: Andreas Franzkowiak/Wikimedia Commons
The Kraglund Man was discovered in 1898 in Nordjylland, Denmark. He is believed to have been male, but there is little documentation, and the body has been lost. He was the first bog body to be photographed before being moved from where it was discovered. Credit: Georg Sarauw/Wikimedia Commons
The Rendswühren Man was a 40- to 50-year-old man who died in the 1st century CE. He is believed to have been beaten to death and was buried with his clothing: a rectangular wool cloak and a fur cape. He was discovered outside the town of Rendswühren in Germany in 1871. Credit: Andreas Franzkowiak/Wikimedia Commons
A picture of the Rendswühren Man taken in 1873, two years after he was discovered. Credit: Johanna Mestorf/Wikimedia Commons
The Roum Head was found in Himmerland, Denmark, and belonged to a man in his 20s who died during the Iron Age. The find was originally titled as “The Roum Woman” until traces of beard stubble were found on the face. Credit: Wikimedia Commons
The Haraldskær Woman was discovered in a bog in Jutland, Denmark in 1892 and was initially believed to be Queen Gunnhild of Norway, a quasi-historical figure from around 1000 CE who was said to have been drowned in a bog. Thinking it was their ancient queen, the Danish monarchy had the body placed in an elaborate glass-covered sarcophagus inside St. Nicolai Church in central Vejle, Denmark. In 1977, radiocarbon dating proved that the woman actually lived nearly 1,500 years before the revered queen, and likely died in the 5th century BCE. She was around 40 years old at the time of her death. Credit: McLeod/Wikimedia Commons
The Haraldskær Woman in her glass-covered sarcophagus. Credit: Västgöten/Wikimedia Commons
The Kayhausen Boy was a child aged 7 to 10 years old who is thought to have been killed between 300 and 400 BCE. He had an infected socket at the top of his femur that would likely have made him unable to walk. His killers bound his hands and feet with cloth torn from a fur cape and stabbed him four times. His body was discovered in a sphagnum bog in Lower Saxony, Germany in 1922. Credit: Department of Legal Medicine, Universitätsklinikum Hamburg-Eppendorf
Carved by someone in Ancient Egypt between 3700–3500 BCE.
Academy Award-winning Helen Hunt is a pharma-stunned alien who doesn’t enjoy any screentime in a disjointed plot that prioritizes misdirection over coherence.
Great cinematography. I loved the background score by William Arcane.
I can’t get enough of Gustav Nordgren’s art. Here’s a Mughal outpost. Not high-enough resolution, but tap/click for a larger image.
Zooming in!
Here’s another. It’s Aztec!
Just beautiful stuff 😍
I will always let my puppy inspect everything for approval (and, usually, immediate loss of interest.)
Low resolution but enough to make the point.
See also: Every Website in 2019 and 2018. The mobile web is a garbage nightmare shitshow (especially if you want to read any fucking news.)
Allysson Lucca is a Brazilian designer who took this original list (cached) of what the world would look like with 1,000 people and shrank it to a hundred.
The idea of reducing the world’s population to a community of only 100 people is very useful and important. It makes us easily understand the differences in the world. There are many types of reports that use the Earth’s population reduced to 100 people, especially in the Internet. Ideas like this should be more often shared, especially nowadays when the world seems to be in need of dialogue and understanding among different cultures, in a way that it has never been before.
Transcribed from the graphic:
“If the world were a village of 1000 people.” This was written in 1990.
If the world were a village of 1000 people:
The people of the village would have considerable difficulty communicating:
In the village there would be:
Sixty of the thousand villagers would be over the age of 65.
Just under half of the married women would have access to and be using modern contraceptives.
Each year 28 babies would be born.
Each year 10 people would die, three of them for lack of food, one from cancer. Two of the deaths would be of babies born within the year.
One person in the village would be infected with the HIV virus; that person would most likely not yet have developed a full-blown case of AIDS.
With the 28 births and 10 deaths, the population of the village in the next year would be 1018. In this thousand-person community, 200 people would receive three-fourths of the income; another 200 would receive only 2% of the income. Only 70 people would own an automobile (some of them more than one automobile).
About one-third would not have access to clean, safe drinking water. Of the 670 adults in the village, half would be illiterate. The village would have 6 acres of land per person, 6,000 acres in all, of which:
If the world were a village of 1000 persons, there would be five soldiers, seven teachers, one doctor. Of the village’s total annual expenditures of just over $3 million per year, $181,000 would go for weapons and warfare, $159,000 for education, $132,000 for health care.
The village would have buried beneath it enough explosive power in nuclear weapons to blow itself to smithereens many times over. These weapons would be under the control of just 100 of the people. The other 900 people would be watching them with deep anxiety, wondering whether the 100 can learn to get along together, and if they do, whether they might set off the weapons anyway through inattention or technical bungling, and if they ever decide to dismantle the weapons, where in the village they will dispose of the dangerous radioactive materials of which the weapons are made.
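A tangent for the numerically curious: every number in the graphic is plain proportional scaling. Here’s a minimal Python sketch of the idea; the ~5.3 billion world-population figure is my assumption (back-solved to fit 1990, when the graphic was written), not a number stated in the original.

```python
# Minimal sketch of the scaling behind "the world as a village of 1,000":
# divide a real-world headcount by (world population / village size) and
# round to the nearest villager.
#
# Assumption (mine, not the graphic's): a world population of ~5.3 billion,
# roughly right for 1990.

WORLD_POPULATION = 5_300_000_000
VILLAGE_SIZE = 1_000
PEOPLE_PER_VILLAGER = WORLD_POPULATION / VILLAGE_SIZE  # 5.3 million

def villagers(world_headcount: int) -> int:
    """Scale a real-world headcount down to the 1,000-person village."""
    return round(world_headcount / PEOPLE_PER_VILLAGER)

# Back-solving: "sixty of the thousand villagers would be over the age
# of 65" implies roughly 318 million such people worldwide.
print(villagers(318_000_000))  # -> 60

# The year-over-year population line is plain addition:
print(VILLAGE_SIZE + 28 - 10)  # -> 1018 villagers next year
```

Shrinking the village from 1,000 to 100, as Lucca did, just changes VILLAGE_SIZE.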
I’m put off by the word “gubernatorial” whenever I see it. Seems very silly, saccharine, like something a 5-year-old mispronounced in 1953 that just stuck because it was so cute 🙄
Nope.
This is indescribably badass.
It belongs to Tom Bosworth. I thought “I cannot even run that fast” and decided to look at the video.
While it certainly does look like they’re running, there are some severe restrictions on their movement, of course, else it’d be an event called ‘Trotting’. The rules are that a walker’s foot must appear to be in contact with the ground at all times (no loss of contact visible to the eye), and that the advancing leg must be straightened at the knee from the moment it first touches the ground until it passes under the body.
Judges look out for any infractions by eye (no technology) and disqualify people as appropriate.
When I was a kid, I’d close my eyes and allow the strange shapes caused by the night lamp and blood flow in my eyelids to put on a small dance and lull me to sleep. Many a time, as I’d relax and drift away, I would suddenly feel this ‘inflation’ and loss of personal boundary and sense of geometry. In one instance, I’d be the infinitely small nucleus of a sphere whose inner surface would race away very quickly from me. In another, I’d be viewing an animal (mostly elephants because I love them) or an object that would grow and warp very quickly in size and texture. I don’t know how to even express the rest. They should’ve sent a poet, etc. While most were very pleasant, a few would be horripilating enough where I’d have to open my eyes and situate myself to snap out of whatever was happening.
Finding out that you’re not the only weirdo who experiences (and likes) certain things is probably one of the greatest joys of the Internet. Just off this one thread on /r/meditation:
I sometimes get strong changes in my sense of physical size and location. These either pass as my concentration increases or they stick for a while. My head might feel very high up or large. My feet feel as if they’re below the floor or further behind me than they could possibly be. My whole sense of physical space can get warped and skewed so that even thoughts that arise about physical space seem to be confused, e.g. flattening 3D space into 2D or flattening everything into a line. It’s enough to be a distraction sometimes. (source)
Cool, this is the first time I’ve heard someone express a situation similar to mine. During a sit one time I kept inflating to the size of a massively obese man. The effect was so real I had to stop and open my eyes to make sure nothing bizarre was actually happening. (source)
Mhm, I’ve had that too. It felt like I was turning into the marshmallow man from Ghostbusters. (source)
What I felt as a child is common and normal in children, and in adults who are red-belts at meditation. It’s a level of jhana (that’s a Pali word, dhyana is its Sanskrit twin), one of the two forms of meditation in Theravada Buddhism (vipassana being the other.)
Jhana has eight levels of practice “first codified by Buddhists over 2,000 years ago.” I’m going to smush this article (“A”) and this Britannica entry (“B”) into a quick description of each.
A - When internal concentration is strong enough, J1 is entered, accompanied by strong physical pleasure—“better than sexual orgasm” ([9] p.151)—and greatly reduced vigilance with smaller startle responses[…]
B - Initially, the Theravadin meditator seeks to achieve detachment from sensual desires and impure states of mind through reflection and to enter a state of satisfaction and joy.
A - In J2 joy “permeates every part of the body,” but with less physical pleasure.
B - In the second stage of this form of meditation, intellectual activity gives way to a complete inner serenity; the mind is in a state of “one-pointedness” (concentration), joy, and pleasantness.
A - In J3, the character of joy changes to “deep contentment and serenity.”
B - In the third stage, every emotion, including joy, has disappeared, and the meditator is left indifferent to everything.
A - J4 is described by “equanimity—a profound peace and stillness.”
B - In the fourth stage, satisfaction, any inclination to a good or bad state of mind, pain, and serenity are left behind, and the meditator enters a state of supreme purity, indifference, and pure consciousness.
A - The higher-numbered jhanas J5–J8 are characterized by more subtle and profound perceptions […] Each jhana is reported to be deeper and more remote from external stimuli than the last
B - The dhyanas are followed by four further spiritual exercises, the samapatti-s (“attainments”):
So there 🙏🧘♀️📿
Don Gorske started eating them in 1972 and continues to do so. He will buy 6-8 at a time twice a week (at the same McDonald’s franchise) to save on gas, which works out to roughly two Big Macs a day. He’s kept all boxes and receipts, which I suppose are what you’d need to apply for and maintain a Guinness Record.
Emphasis mine:
Seriously, if you have five minutes, give the whole video a watch. Even if the idea of eating Big Macs every day isn’t for you, there’s something to be said for Gorske’s power of persistence and the joy he finds not only in his routine but in being himself. Sure, it’s not necessarily the noblest of records, but at a time when people are winning medals for artistic swimming and table tennis, who’s really to say which feats are more notable than others?
Two things:
Lovely Python-esque stuff.
I love these etudes by Professor Yun Shin. I saw them at the Des Moines Art Center a few years ago.
All images © the artist
There isn’t much variety in the music I listen to. I stick to soundtracks, minimalist composers, some weird surprises¹, and mostly to what my good friend calls “electronic windchime shit” (by which he means “ambient music.”)
This means I’ve heard fewer than, say, twenty country songs in my life so far (aside from Dolly Parton ♥️). And this is the country-est of them all.
I suppose it’s nice that the boffins behind YouTube think I need a change of pace 🎣
I will listen to things if the album art looks interesting.↩︎
She addresses graduate/PhD students struggling to complete their theses, but there’s quite a bit to learn here. She considers procrastination a perfectly logical response: why wouldn’t one seek pleasure? It’s something that (a) reveals a lot about what you’re afraid of and (b) is hence a protection response.
Other notes, thoughts, etc:
Over the past year, I’ve been amazed by how much mindfulness comes up in almost every conversation I have (or book I read or podcast I listen to) about self-improvement and Joy in Life.
Update
Was reading an article on the efficacy of to-do lists and lo!
It’s the same deal as with weight loss: you will lose weight if your kitchen only contains healthy and low-sugar food.↩︎
I used to have a printout of this at my desk at work because I just loved looking at it so much 🌸♥️ It was pretty popular on the internet a while ago. The little girl’s name is Butedmaa and she was just 5 when this picture was taken in 2003 by photographer Han Chengli, who titled it “Inner Mongolian Child”. Here’s another of her with her family in 2014.
Birthday gift from TK. Sharp, vibrant, funny, and dark. The characters’ philosophical ruminations are self-indulgent and sophomoric and tedious. Don’t know if that was the point.
Glass Octopus (Vitreledonella richardi)
Longarm Octopus larva (Macrotritopus defilippi)
Marine Snail (Atlanta inclinata)
Sea Butterfly (Clio chaptali)
Eye-flash Squid (Abralia veranyi)
Deep sea eel
Male copepods
Larval Prawn (Plesiopenaeus armatus)
Glass Squid (Bathothauma lyromma)
Note that the company still had a market cap of $5B at the time of this writing.