When I was in high school and college, I thought science classes were pretty boring. Why did I have to learn this stuff? Who cared about the phases of cell mitosis, the difference between ionic and covalent bonds, and Newton’s laws of motion? What practical value would science ever hold for me?
I wasn’t alone in my opinion that science was boring and had no relevance to my daily life. The US currently faces a shortage of students entering STEM fields, and I think basic science education is a major contributing factor.
I’m not saying I had bad science teachers. I had great science teachers! The problem isn’t with teachers themselves—it’s with how our education system demands that science be taught.
In most of the Western world, science is presented as an endless litany of facts. Students memorize enough information to pass their tests, but too few leave the classroom with a true understanding of what science actually is, and why it’s so important.
And because we see science as a meaningless encyclopedia of boring facts, a wide variety of misconceptions about science creeps into our worldviews as we age and form our identities. This makes us susceptible to all sorts of agenda-driven groups who manipulate our opinions and perceptions of science, its practitioners, and its findings.
It’s hard for me to describe how deeply this saddens me, because today, I love science, and I value the clear arc of technological progress that science has …
A growing community of people—both laypeople and experts—believe that since computers have been getting exponentially smarter for many decades, they will surpass human intelligence sometime within the next fifty years, and that when they do, they will pose a major threat to us. I am one such worrywart. But when weighing this claim, skeptics—again, both laypeople and experts—often raise a specific, and horribly wrong, counterargument against the worrywarts. That argument goes something like this:
“A conscious computer? Like in the movies? Give me a break. Computing technology is centuries away from being able to create a machine that has feelings, awareness, and a sense of selfhood like that of humans. The human brain is far too complex. It’s pointless to worry so much about something that won’t exist for hundreds of years.”
This argument misunderstands what AI actually is. The first artificial superintelligence will not be a conscious being. It won’t have feelings, it won’t “hate humans”, it won’t be aware of its own capacity to think, and it won’t have a mind that can process and reflect on subjective experiences the way ours can. The skeptics are right that, given current trends of technological growth, we’re probably centuries away from being able to create such a truly conscious being. But that’s not the point, and it never has been.
The point is that consciousness and intelligence are not the same thing. And AI researchers aren’t trying to build artificial consciousness (AC). They’re trying to …
In an effort to be more patriotic, I’ve recently decided to start spending my personal income in the same increments as our federal government spends its revenues. First I pay off my numerous debt obligations, and then I have a certain amount left to spend.
This leftover amount is what I call my “discretionary spending”. Here’s what it looks like:
The smaller slices pay for my smaller, unimportant expenses, like the power bill, school, food, and rent. I could almost do without those things, as could we all. But do you notice the big blue slice that takes up almost half of the pie? Like our government, for the sake of my own safety, I’ve decided to spend almost half my money on this one thing. I’ve decided to spend almost half my money on…
As I type this blog post, I’m surrounded by a squad of heavily armed soldiers securing my house. But don’t worry: they’re part of my very own private paramilitary security force! I just hired them, and they’re only here for my protection. They’re here to keep me safe.
I’ve hired these brave soldiers to equip my house’s perimeter with an electric fence, barbed wire, and pillboxes, from which machine gunners constantly monitor my house and its surroundings. A cluster of land mines has been placed under my driveway, just in case an enemy vehicle tries to penetrate my …
Well, I finally saw the film La La Land and I have some opinions on it.
It’s a good movie. Actually it’s a great movie, and you should see it. It’s romantic, and sizzling with enthusiasm for life and for art, and directed with incredible class and style. In fact I’m super jealous of Damien Chazelle, who directed it, because he’s around my age and is way more talented than I am. How’d you get such mad directing skills, dude?! (Oh, you studied filmmaking at Harvard? I guess that explains that.)
As someone who’s always wanted to make a feature-length musical film (still a bucket list item for me!), I got lost in the musical numbers, the choreography, and just how damn well the whole thing is put together. And La La Land provoked some strong reactions in me, as I think it will in any young artist, or young-at-heart artist. It made me feel that raw hunger for art that I felt in my early twenties, that feeling that no matter what I was working on, it was important, and necessary. It made me feel the calling that all artists feel in a fresh and new way.
That drive, that hunger—it never left me. It’s just as strong as it ever was, but it’s become less raw and more focused as I’ve gotten a little older. My artistic hunger has become so focused, actually, that it drove a wedge between myself and La La Land, a film …
In the final episode of Cosmos: A Spacetime Odyssey, Neil deGrasse Tyson says something that I think demonstrates profound humility. After teaching us about the wonders of the universe for twelve inspiring episodes, he dedicates the final episode to explaining why curiosity and openness are so important. And at one point he looks directly at the camera and tells us, “Question everything. Even me.”
That was the exact moment when I first became a die-hard Neil deGrasse Tyson fan. He’s so dedicated in his quest for truth that he actually invites scrutiny and challenges to his own ways of thinking. He’s so eager to know what’s true that if he’s wrong about something, he actually wants you to prove him wrong so his beliefs will more closely mirror reality. These traits run so counter to the common, unspoken plea of many other thought leaders: “Question everything. Except me.” And unfortunately, it’s not only the bad leaders who adopt this code. Many otherwise good leaders follow it, too.
To me, one of the most frustrating aspects of this election season has been our willingness, both as individuals and as a society, to condemn authoritarianism when we disagree with it and yet eagerly embrace it when it caters to our own cherished beliefs. When our political opponents want to silence us and bend us to their will, we see that as evil. But somehow we think it’s okay for us to silence our political opponents and bend them to our will. …
Has anyone ever written the Game of Thrones of time travel? By which I mean the definitive, sweeping epic that exemplifies the time travel subgenre in the popular imagination? I’m honestly asking; I don’t know the answer. I can point to the definitive time travel film—Back to the Future—but not necessarily the definitive time travel book. Which is weird, because I love time travel.
For every other genre of speculative fiction, there’s a particular, well-known book or two that encapsulates the genre in its grandest, most complex, most fully realized and developed state. Fantasy fans might point to A Song of Ice and Fire or the Malazan series. Sci-fi fans would probably point to Dune, or maybe Foundation. Horror fans can point to The Shining, or if not The Shining, then at least to the best of Stephen King’s work.
Even subgenres have definitive works. Military sci-fi has Starship Troopers and The Forever War. Cyberpunk has Neuromancer and Snow Crash. Weird fiction has the Cthulhu Mythos, post-apocalyptic fiction has The Stand, urban fantasy has The Dresden Files. I could go on.
But for the life of me, I’ve never been able to find the single time travel book that anyone can point to and say, “That! That right there. That one book is the time travel genre, defined.” You might say H. G. Wells’ The Time Machine fits the bill, but it was published in 1895, and is more of a …
Demons are a fascinating concept to me, largely because they’re the most untapped of the great movie monsters.
Vampires? Played out. Werewolves? Snore. Ghosts? Blah. But I think there’s still a lot that could be done with demons, if we just had the imagination to think outside the box we’ve put the devil in.
Unfortunately, the creative world seems to be stuck with three stereotypes of demons:
1.) The loud, feral, violent, sex-crazed, Catholic demons you see in every possession film that knocks off The Exorcist.
2.) The snarling Buffy-esque minion-type demons with copious movie makeup that are basically indistinguishable from orcs.
3.) The hunky antihero fallen angel whose fall from Heaven was nothing compared with how deeply he falls in love with our virtuous young heroine, with whom he can never be due to X, Y, and Z. (I obviously love the antihero part; the rest not so much.)
Protestants have their own demon myths, but for some reason the “Protestant demon” has never really caught on in pop culture, to the best of my knowledge. Pop Christian fiction contains plenty of demon stories, most notably Frank Peretti’s super-popular This Present Darkness and C.S. Lewis’s classic The Screwtape Letters. Sometime during my undergraduate film studies, almost a decade ago, the thought first occurred to me that someone should write a horror-fantasy story about these Protestant demons, but one that’s not in any way religious and can appeal to a broader audience. And soon, as so many story …
Earlier this month, Elon Musk caused quite the hubbub on the internet when he claimed that we are almost definitely living in a simulated reality—i.e. that our reality isn’t “real” in the traditional sense, and we’re essentially living inside a computer program. If you’ve never heard anyone say this before, it probably seems like a pretty bizarre thing to believe. “Eccentric Billionaire Thinks We Live in the Matrix” is a clickbait title you’d easily scroll right past on Facebook.
But I think Elon Musk stands a good chance of being right. I just think so for an entirely different reason.
Although philosophical debates about the nature of reality stretch all the way back to antiquity, Musk is getting his ideas from a specific and very influential paper called “Are You Living in a Computer Simulation?” written by philosopher Nick Bostrom in 2003. If you’ve never read it, you should. It’s a pretty mind-blowing piece of writing, and it was the first formal presentation of what Bostrom calls “the simulation argument.”
The argument basically says that because the technology that allows us to simulate virtual worlds (like video games) is constantly improving, eventually we’ll be able to simulate virtual worlds perfectly, so that they’ll be indistinguishable from reality. Even the people in those simulations will be indistinguishable from real people. At this point, when our descendants decide to run simulations, those simulations themselves will internally give rise to perfect simulations of their own: simulations within simulations. So …
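The counting step behind that argument can be sketched as a toy calculation. All the numbers below are invented purely for illustration (Bostrom’s paper doesn’t commit to any of them): suppose each world runs some number of simulations, each as populous as its parent, nested to a fixed depth.

```python
# Toy illustration of the simulation argument's counting step.
# Hypothetical assumptions: every world hosts `per_world` observers,
# every world spawns `n_sims` child simulations, nesting stops at `depth`.

def observers(per_world, n_sims, depth):
    """Return observer counts per nesting level; the real world is level 0."""
    counts = []
    worlds = 1
    for level in range(depth + 1):
        counts.append(worlds * per_world)
        worlds *= n_sims  # each world at this level spawns n_sims child worlds
    return counts

counts = observers(per_world=10**10, n_sims=1000, depth=3)
simulated = sum(counts[1:])  # everyone below level 0 is simulated
total = sum(counts)
print(f"Chance a random observer is simulated: {simulated / total:.9f}")
```

Even with modest made-up numbers, the simulated observers swamp the real ones, which is why a randomly chosen observer should bet they’re in a simulation—if the nesting ever happens at all.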
In the very near future—possibly by the end of this decade—3D printers will become as cheap as regular printers, and about as ubiquitous. At that point, anyone with access to a 3D printer will be able to torrent the design for any type of weapon off the internet, print that weapon’s parts, assemble them, and then use the resulting weapon however they wish. This includes mass shooters.
Different 3D printers use different materials, and print with different levels of detail. You wouldn’t use the same 3D printing machine to print big plastic blocks of uniform structure as you’d use to print some of the more intricate metal parts of a gun. But since all types of 3D printers will have myriad applications in every conceivable industry, all of them will become ubiquitous. So a potential mass shooter should have little problem accessing them, even if the one he uses at home can’t print quite all of the parts he needs.
The design of weapons is adaptable, too. No company would sell a gun (by which I mean a working, lethal gun) made of rubber, but given the breadth and depth of human knowledge, I guarantee you that someone on the internet can build a gun made exclusively of rubber that could kill a person. So if a killer has access to only one type of printing material, that won’t necessarily stop him from printing a gun. (Although I admit that a mass…
“Art is finally democratized! Anyone can access the tools of production! Sure it can be tough to reach an audience, but if you work hard and persevere, you have a legitimate shot at making a living doing what you love. The gatekeepers have been abolished! The indie revolution is here!”
So goes the common refrain.
And that refrain is certainly truer for more indie artists now than ever before in history. But not all artistic mediums are created equal. As someone who comes from the indie film world, who has friends in the indie music world, and who is now setting sail in the indie publishing world, I’m consistently surprised by how differently the indie revolution has manifested itself in each medium. I’ll admit, though, that I have more questions than answers about this.
For instance, why did the various indie revolutions start at different times? Indie music started taking off in the late 90s, indie film in the early 00s, indie games in the late 00s, and indie publishing not until the early 10s. This makes no sense to me, since publishing is the least technologically complex of all the artistic mediums listed above, and so it’s presumably the easiest to democratize, taking control away from The Man and putting it in your hands and mine. Publishing is essentially just printing words on a page. Shouldn’t the indie revolution have hit publishing first? But instead it hit publishing last.
But that’s beside the point. Perhaps my biggest question about the …