Two of Bill Gates’s favorite soup-to-nuts books of the past decade are Steven Pinker’s “The Better Angels of Our Nature” and Yuval Noah Harari’s “Sapiens.”
The appetite for such stories seems indiscriminate—tales of deterioration and tales of improvement are frequently consumed by the same people.
Perhaps what readers like Gates find valuable in these books
has less to do with the purported shape and direction of history
than with the broad assurance that history has a shape and a direction.
“The Dawn of Everything: A New History of Humanity” (Farrar, Straus & Giroux) is a profuse and antic account of how we came to take that old narrative for granted and why we might be better off if we let it go.
Viewed closely, the course of human history resists our favored schemata.
Hunter-gatherer communities seem to have experimented with various forms of farming as side projects thousands of years before we have any evidence of cities.
Even after urban centers developed, there was nothing like an ineluctable relationship between cities, technology, and domination.
Çatalhöyük isn’t the only site that calls into question the presumption
that the Neolithic era was patterned on a single civilizational kit.
Graeber and Wengrow report that some cities thrived long before they showed signs of hierarchical systems—
such as temples and palaces—and some never developed them at all.
“In others, centralized power seems to appear and then disappear,” they write.
“It would seem that the mere fact of urban life does not, necessarily, imply any form of political organization.”
If cities didn’t lead to states, what did?
Modern ethnographic treatments of Indigenous communities describe an astonishing level of social plasticity; indeed, one of the first things you learn in an introductory course in anthropology or archeology is that pat appeals to cultural evolution are retrograde and silly.
Critiques of grand narratives have been important to the modern self-image of these fields—in part as penance for having once been happy to serve the priorities of empire, peddling “civilization” as a gift to the “primitives.”
One consequence, however, is that wholesale synthetic accounts of human history tend to be written in the extravagantly roughshod mode of Harari’s “Sapiens” or Jared Diamond’s “Guns, Germs, and Steel.”
Anthropology was reduced to “spiteful ethnography”; it put itself in the business of “disapproving of intellectual constructions but not of creating, or perhaps even of understanding, any.”
“If something did go terribly wrong in human history—and given the current state of the world, it’s hard to deny something did—then perhaps it began to go wrong precisely when people started losing that freedom to imagine and enact other forms of social existence.”
A framework that deems history a story of material progress recasts Indigenous critics as innocent children of nature, whose views on freedom were a mere side effect of their uncultivated way of life and could not possibly offer a serious challenge to contemporary social thought.
For students, an IDE primarily serves two purposes.
First, in a Java-focused curriculum, it insulates the student from the javac command-line program, and from the command-line environment itself.
Second, it catches some basic mistakes and allows the student to defer learning the finicky language requirements
that aren’t deemed core to the curriculum, like imports and file-naming requirements.
What they can’t do, unless they’ve figured it out on their own, is operate a computer outside of the confines of the IDE they’ve been taught.
In none of the curricula I’ve seen, through personal experience or reading syllabi provided by other students,
is there a place for students to get past the myriad barriers
that constitute the use of a computer in the modern day.
Students who use Windows aren’t taught that, while their file system is case-insensitive, not all filesystems are.
They probably aren’t taught that a “file system” is a concept until a 300-level operating systems course.
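The case-sensitivity point is easy to demonstrate live. Here is a minimal shell sketch (the filenames are illustrative, and the result depends on the filesystem you run it on: Linux filesystems are typically case-sensitive, while Windows and default macOS volumes are not):

```shell
# Probe whether the current filesystem is case-insensitive by creating
# a file and then asking for it under a different capitalization.
demo=$(mktemp -d)        # scratch directory so we don't clobber anything
cd "$demo"
printf 'hello\n' > Readme.txt
if [ -e readme.txt ]; then
    fs_case="insensitive"    # typical on Windows and macOS
else
    fs_case="sensitive"      # typical on Linux
fi
echo "this filesystem is case-$fs_case"
```

A student who submits `Main.java` from a case-insensitive machine may never discover the mismatch until the grader's case-sensitive server rejects it.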
Students who use Mac OS aren’t taught what the .DS_Store file is, or why it’s irrelevant to their project submissions.
Students learning Java don’t know that javac is their compiler and java their virtual machine, at least until they take a course in compilers.
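The division of labor between the two tools takes about a minute to show from a terminal. A sketch, assuming a JDK is installed and on the PATH (the compile-and-run steps are guarded so the snippet degrades gracefully without one):

```shell
# javac compiles Java source into JVM bytecode; java then runs that
# bytecode on the virtual machine. No IDE involved.
cat > Hello.java <<'EOF'
public class Hello {
    public static void main(String[] args) {
        System.out.println("Hello from the command line");
    }
}
EOF
if command -v javac >/dev/null 2>&1; then
    javac Hello.java   # produces Hello.class (bytecode, not machine code)
    java Hello         # the JVM loads and executes Hello.class
fi
```

Seeing `Hello.class` appear on disk makes the compiler/VM split concrete in a way that an IDE's green "Run" button never does.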
Nobody focuses on things like ASCII, Unicode, and UTF-8, or on how programs interoperate, or on how to share and distribute programs that students write.
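The character-versus-byte distinction, for instance, fits in two lines of shell (this assumes a UTF-8 terminal, which is the default nearly everywhere today):

```shell
# In UTF-8, ASCII characters occupy one byte, but most other characters
# occupy more. 'e' is one byte; the accented 'é' is two.
ascii_bytes=$(printf 'e' | wc -c)
utf8_bytes=$(printf 'é' | wc -c)
echo "e: $ascii_bytes byte(s), é: $utf8_bytes byte(s)"
```

A student who has seen this is far less mystified the first time a "length" function and a file size disagree.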
Introductory CS curricula focus on abstract ideas of programming, and use IDEs to accomplish that.
Why is this a problem?
“Software is nothing but the details.”
When students don’t understand what a file is, or haven’t ever edited text in anything but Microsoft Word and don’t realize they can edit code outside of an IDE,
they will not be able to do the crucial work of self-directed learning that is a hallmark of all computer science success.
When students have only ever programmed in Java using some bespoke learning library provided by their professor,
it will take them much longer than necessary to figure out other languages, other libraries, and other approaches.
In a field that moves as fast as this one does, that’s a very serious problem.
It also undermines their ability to learn in a classroom setting going forward.
Among my fellow students, those who merely do what is expected of them fall behind the ones who have explored these tools on their own.
Most importantly, though, it limits the ability of their peers to learn.
Students need to know how to use computers before they can program them in a serious way.
Moving forward - or backwards, or sideways?
Provide a standardized environment - as a VM, perhaps, or using something like repl.it or ideone, or similar software designed specifically for education.
Use a language that teaches the fundamentals of the paradigm you’re interested in, like Scheme or Python. (Please, please not Java.)
Provide support for students who are interested in doing their own thing.
We need to teach students about computers themselves.
After the first foray into programming, take time to teach students about the UNIX command line.
Show students how things they’re familiar with - graphical file managers, for instance - interact with these new command line skills, and how their programming languages interact with those.
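One such connection worth making explicit: the "folders" in a graphical file manager are just directories, and the GUI quietly hides dotfiles (like .DS_Store) that the command line can reveal. A sketch of that demo (filenames are illustrative):

```shell
# A graphical file manager and `ls` are two views of the same directory
# tree. Hidden "dotfiles" (e.g. macOS's .DS_Store) appear only with -a.
workdir=$(mktemp -d)
cd "$workdir"
touch report.txt .DS_Store
ls       # lists report.txt only, like the GUI does
ls -a    # also lists ., .., and .DS_Store
```

Once students see that the two views agree, the command line stops feeling like a separate, scarier computer.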
Commit to teaching a standardized environment that all students have access to.
Ubuntu LTS is great for this: students with their own computers can run it in a VM, the school can provide machines that run it, and adventurous Windows users can use WSL.
A commenter on Lobste.rs pointed out that there are some great resources from MIT around learning tooling and build systems: The Missing Semester of Your CS Education
Does anybody remember the Second Life boom when companies were trying to snap up linden-land and set up shop online? That failed, and I can't help but feel like the 'metaverse' concept being marketed to us is that, but with VR helmets and advertising strapped on.
The pitch from Meta is that with AR/VR and the ability to use 3D space, you can actually turn Second Life-style virtual worlds into something useful, with actual tangible benefits.
I'm shocked people are taking "Meta" at face value.
To me it's a way to signal to outsiders that Facebook is still cool and hip. That's it.
Now when recruiting they can play the Meta-not-Facebook angle.
Now they can pad earnings calls with the amazing success the metaverse is seeing (so what if it's losing us money? that's the future!).
It's like an inverse Alphabet: where Alphabet silently serves as an umbrella for moonshots, Meta is a moonshot that's an umbrella for boring old Facebook.
Second Life allowed for a lot of the NSFW content and interactions that people tend to enjoy both in entertainment and in real life (this is also true of VRChat to some extent).
Metaverse will be a sanitized, sterile project for children.
Fundamentally, the people responsible for its execution, like Zuckerberg, do not understand what people want, which is why the metaverse has no chance of success.