Posted at 09:22 AM in Movies | Permalink | Comments (0) | TrackBack (0)
Gavin Rothery developed the stunning look behind "Moon." Turns out he's also nice enough to list 34 movies he says are worth watching. A couple of my favorites are there (Solaris, Dark City, along with the better-known Blade Runner/Star Wars/Matrix). But there are a good dozen that I -- and probably you -- haven't had the pleasure of seeing yet. Here's the full list.
Posted at 04:20 PM in Movies | Permalink | Comments (0) | TrackBack (0)
We draw on our past experiences to extrapolate about the future, so it's interesting to find that the period we remember best is our 20s. But why?
As Katy Waldman at Slate explains, one theory is that it's the time of life when we experience a large number of important events, such as jobs, independent living and marriage, for the first time. We remember novel situations better, so that makes some sense. Another possibility is that the 20s are a time when the brain is operating at peak efficiency, so it's simply doing a better job of recording information for later recall. Then there's the most tantalizing explanation: the 20s are a decade when we're creating the narratives of who we are for the rest of our lives. Snip:
"In 2002, following in the footsteps of the narrative theorists, Berntsen and David Rubin advanced a “life script” account of the reminiscence bump. They defined the life script as a culturally conditioned storyline of events that make up a skeletal life course—and claimed that people often consult such a template when asked to remember their pasts."
If this is true, then we could better understand certain demographics by looking at the events that defined their 20s. If you're 40 today, your narrative (plus or minus a few years) is based on the 1990s. If you're 70, it's the 1960s. This is a fun exercise right up to the point you think about people now in their 30s. They'll best remember 2000-2010.
The full piece is here.
Posted at 11:19 PM in Psychology | Permalink | Comments (0) | TrackBack (0)
This ad ran about 30 seconds ago during the Golden Globes, so I'll be light on the commentary. Suffice it to say, GE's new commercial, "Robots on the Move," pays beautiful homage to past visions of our robotic future.
(I think we can also all agree that it could be the trailer for the best reunion TV show ever.)
Posted at 09:19 PM | Permalink | Comments (0) | TrackBack (0)
"The Last Myth" has earned its place alongside Philip Zimbardo's "The Time Paradox" and I.F. Clarke's "The Pattern of Expectation" as one of the best books on humanity's concept of "the future." While its title promises an exploration of how apocalyptic thinking evolved, authors Mathew Barrett Gross and Mel Gilles also serve up answers to two other big questions: why did humanity change from thinking that time is circular to linear, and how has the idea of progress changed from the Renaissance through today? All three ideas are woven together in a compelling, jargon-free narrative that is — no pun intended — revelatory.
For example, there have been several points in history when people (granted, slowly) made a 180-degree change in how they think about time. In its earliest days, humanity interpreted life events as the forces of destruction seeking balance with the forces of creation. That changed into a concept of good continually battling evil. There was another period when people thought that every action was a repetition of what ancestors had done before them; there was nothing new under the sun. Over time, that view shifted to the belief that each event is unprecedented and that history is leading us to some specific point, often utopian or dystopian in nature.
The authors wrap up their work by highlighting two alarming trends. The first is that apocalyptic thinking in America has reached levels in the past decade that haven't been seen worldwide in a thousand years. The second is that the desire to view global events through an apocalyptic lens is clouding our ability to tackle real problems (see "With 2012 Over, Is the Apocalypse Dead?").
Suffice it to say, historians, futurists and anyone who wants a fresh take on the concept of time itself will find "The Last Myth" educational and enjoyable.
Posted at 11:58 AM in Books | Permalink | Comments (0) | TrackBack (0)
If you remember how excited I was when I found a book that profiled 16 archetypes, you can imagine my reaction when I saw one that does it for 60! Even better, this beautiful beast, Archetypes in Branding: A Toolkit for Creatives and Strategists, is written not for screenwriters, but for marketers. It has 60 baseball card-style cards — one for each archetype — that let you literally sift through to find the right match for the company or executive you're branding.
This is how I looked to the guy at the cash register.
In addition to all of these profiles, authors Margaret Hartwell and Joshua Chen load up the introduction with the best collection of definitions of archetypes I've seen. (My favorite is from Jon Howard-Spink: "A universally familiar character or situation that transcends time, place, culture, gender and age. It represents an eternal truth.") There are also sections on the differences between stereotypes and archetypes, using archetypes ethically, and how they can be applied to a brand to express its disparate components as one narrative.
All of these elements make this "kit" a must-have for corporate storytellers. My one question for Hartwell and Chen: when can we get the archetype cards in app format? It would be great to have the "cards" as something that could be flipped through on a mobile device, with living comments below where people could discuss them Reddit-style.
Posted at 12:05 PM in Archetypes, Books | Permalink | Comments (0) | TrackBack (0)
Over at the World Future Society's site, I take a look at whether closing the books on 2012 means we're done with doomsday prophecies: "With 2012 Over, Is the Apocalypse Dead?"
The short answer is "no." But an exploration of why people like to believe in things like the (supposed) Mayan prediction reveals a lot about our culture and how we might tackle the legitimate threats ahead.
Enjoy!
Posted at 04:28 PM in Dystopianism | Permalink | Comments (0) | TrackBack (0)
In a must-read post, David Weinberger explores and explains why so many people -- Republicans, journalists and others -- all looking at the same polling data, came to different conclusions about who was going to win the presidency. Here's a snip, but go there to read it all:
"As a matter of empirical fact, data does not drive agreement, or at least doesn’t drive it sufficiently strongly that by itself it settles issues. For one reason or another, some responsible adults are going to get it wrong. This doesn’t mean we should give up. It certainly doesn’t lead to a relativist conclusion. It instead leads to an acceptance of the fact that we are never going to agree, even when the data is good, plentiful, and right in front of our eyes. And, yeah, that’s more than a little scary."
Posted at 04:36 PM | Permalink | Comments (0) | TrackBack (0)
What can Google's search data reveal about our hunger to know today what will happen tomorrow?
Using Google Trends and Google’s Keyword Tool, I examined the top 350 future-oriented keywords. Combined, they represent 152,620,240 searches a month. Here are four of the major trends uncovered.
1. We think about the future most often in December and least often in July. The motivation behind this clockwork activity goes beyond New Year’s resolutions. People are looking for predictions about their careers, spouses, industries and even countries.
2. Four industries are top of mind: One third of the 350 keywords focus on just four sectors -- Technology, Auto, Entertainment and Finance. Anecdotally, we know that the concepts of technology and the future are often linked. What's different now is we can measure the degree to which that’s true. We also see the presence of sectors that aren’t as obviously linked to futurism.
3. The only countries present in the data are the U.S., India and Pakistan. I’ll leave it to others to guess why that's the case, but what’s startling is the absence of searches for European, Asian and African nations.
4. Psychics? Really??? One in five future-oriented keywords reveals people looking for straight-out predictions. Together, they represent 34,889,400 searches a month! That's intriguing at first blush, but the data quickly takes a dismaying turn. Nearly everyone is looking to supernatural sources, like psychics and astrology, rather than rational ones grounded in science or professional forecasting. Maybe it's all in good fun looking up horoscopes, but still...99% is a big number.
How does this square with your expectations about how other people feel about the future? Leave your comments below, and check out Google Trends and Google's Keyword Tool and play around with the data yourself; there's a rough code sketch below to get you started. If you see something interesting, please share!
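For the curious, here's a minimal sketch of one way to check the December-peak/July-trough pattern on your own. It uses the unofficial pytrends library as a stand-in for the Google Trends web interface I used for the analysis above, and the keyword is just an example:

```python
# Rough sketch (assumes `pip install pytrends` and network access): pull five years
# of Google Trends interest for one example future-oriented keyword, then average
# by calendar month to look for the December peak and the July trough.
from pytrends.request import TrendReq

pytrends = TrendReq(hl='en-US', tz=360)
pytrends.build_payload(['predictions'], timeframe='today 5-y')

interest = pytrends.interest_over_time()  # weekly relative interest, scaled 0-100
monthly = interest.groupby(interest.index.month)['predictions'].mean()
print(monthly.round(1))  # rows are months 1-12; watch for the winter spike
```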
Posted at 11:29 PM in Futuring, Predicting, Social Media | Permalink | Comments (0) | TrackBack (0)
Keith Melton's 50-film montage of space helmets reveals a lot about our hopes and fears of stepping off the planet.
A handful of the helmets are replicas from real space exploration, like those from Apollo 13. But the rest have been made up to imagine what would be needed in a variety of environments.
Posted at 02:32 PM | Permalink | Comments (0) | TrackBack (0)
Jon Evans at TechCrunch is willing to do what nearly no other journalist will -- review his past year's predictions to see what he got right and wrong. Evans gave himself a B+ for performance, but if we grade it on a curve for intellectual honesty, it's an "A."
The trailer for the new zombie flick "World War Z" is out. Sure, it has Brad Pitt. But the book's main innovation -- telling a story in the guise of a UN-style report -- was what made it so scarily realistic. Here it is on Amazon.
Hollywood uses just three theories of time travel in all its movies.
What would the US flag look like if it added a 51st state?
Vint Cerf's vision of an internet that could be used for interplanetary communication took another step forward. An astronaut on the International Space Station used "Disruption-Tolerant Networking" to manipulate a robot on Earth.
How would the 2012 election have turned out if the past 200 years of history had taken different turns? Buzzfeed provides us with this set of electoral maps that show the differences had women, people under the age of 21 and African Americans not been enfranchised.
io9's gallery of buildings inspired by rocket ships reveals how deeply the space-age aesthetic has become a part of the literal landscape.
Posted at 09:51 AM | Permalink | Comments (0) | TrackBack (0)
Watch this send-up of TED Talks courtesy of The Onion. It not only captures the archetype of today's innovator, but also what it looks like the moment an innovative idea breaks into the mainstream:
Contrast this with the older archetypes of invention: There's Edison, the "Wizard of Menlo Park," who had reporters come to him to reveal the latest creations of his invention "factory." Or the well-worn Manhattan Project approach, where a group of elite scientists toils in secret until something world-changing and always dangerous is tested in some desert. Go back further and there's Galileo presenting his telescope to the Venetian Senate. Each captures the economies and power structures of its time. Each remains a part of our understanding and expectation of what innovation looks like.
My favorite quote from the Onion video: "We're looking in the eyes of two horrible birds, and we just need a rock that's big enough, efficient enough and innovative enough to bludgeon them. That rock is an idea."
Posted at 09:13 AM in Archetypes | Permalink | Comments (0) | TrackBack (0)
Here are this week's best news, stories and memes on the topic of tomorrow.
The peer-reviewed journal of the Geological Society of America has published a study on the evolution of the idea of creationism and the surprisingly prominent role that geological science played within it. Snip: "As realization grew that the world was unimaginably old, those seeking to reconcile biblical interpretation with geological findings employed two primary arguments. The day-age theory held that each day in the biblical week of creation corresponded to a geologic or cosmic age. The other theory, known as the gap theory, held that God created the world long ago but remodeled it for human use a few thousand years ago. The time in between wasn’t recorded in the Bible, creating an indeterminate gap between the first two verses of Genesis."
The Vintage Ads blog has a collection of retro ads for clocks that show how the devices have been marketed as symbols of status, nature, business, convenience and technological advancement.
On March 20, 2013, an artist will lower a 25' star-shaped sculpture containing the DNA of thousands of people into the deepest part of the ocean. You know, in case we ever need to revive the species. The project's creator says of this "Deep Storage Project," "It's 2001: A Space Odyssey in reverse. Instead of monoliths left to alert God to our technological enlightenment, we are leaving ourselves inside monoliths, to chronicle our technological self-destruction."
There's a word for "the day after tomorrow." It's overmorrow.
Reflecting on how Republicans are processing the defeat of Mitt Romney, Gail Collins points out that "Almost everybody thinks of the world of their youth as the traditional world. In the future, today’s teenagers will be looking back and mournfully declaring that traditional America was a place where folks really knew how to Twitter."
Posted at 07:54 AM | Permalink | Comments (0) | TrackBack (0)
The Internet Archive has just posted every edition of OMNI Magazine free for downloading, including beautiful PDFs that capture the art and ads of the time.
Launched in 1978, OMNI had a profound effect on science literacy and people's understanding of tomorrow. From Wikipedia: "Science Digest and Science News already served the high-school market, and Scientific American and New Scientist the professional, while OMNI was arguably the first aimed at 'armchair scientists' who were nevertheless well informed about technical issues."
Reflecting the spirit of the times, both the NY Times and The Economist added science sections to their rosters a month after OMNI first dropped. The publication's 1986 cover story on nanotechnology brought that nascent field into the mainstream (see this timeline from MetaModern), and its art helped establish the futuristic aesthetic of the '80s (10 covers here). It was also an early nurturer of nerd culture. As LiveScience notes, "As a short hand for an early version of geek-chic, OMNI appeared in 'Ghostbusters,' 'The Breakfast Club' and 'Star Trek IV,' and was mentioned in 'Jurassic Park' and the remake of 'The Fly.'"
In 1982, OMNI published a "Future Almanac"; read it, Ben Bova assured, and "very little that happens in the next two decades will be a surprise." By the way, copies are available for $4 via Amazon, and yes, I just bought one!
Here's an ad for OMNI from 1978 where it declares itself, "The magazine of tomorrow, on sale today."
Posted at 12:39 PM in Pop Culture, Science | Permalink | Comments (1) | TrackBack (0)
Is this video futuristically retro or retro-futuristic? I can't tell, but it's awesome. MakerBot shows how you can make -- LITERALLY -- an '80s-style mix tape by 3D printing it and then adding songs via iTunes. Don't forget to give it to your crush, because some things just work no matter what decade you're from.
Posted at 09:45 AM in 3D Fabrication | Permalink | Comments (0) | TrackBack (0)
When people think "science," they often, and wrongly, think "permanence." The Brain Pickings blog has collected definitions of science from leaders who personify the field, and yet their explanations contradict what we hold true in our popular imagination. Some examples:
"All of science is uncertain and subject to revision. The glory of science is to imagine more than we can prove" - Freeman Dyson
"Science is a way of thinking much more than it is a body of knowledge." - Carl Sagan
Confusion about what "science" means has a profound impact on public policy discussions and affects how people weigh claims at different levels of certainty. Consider the word "theory," as in "the theory of evolution" or "big bang theory." Read this great layman's explanation of what a theory actually is, versus how we use the term in common culture (tip of the hat to Reddit for surfacing it):
"The difference between a theory and a law is not one of 'truth' or in how confident we feel about it. It is not a difference in degree. Theories are not 'inferior facts.' Theories don't graduate to become 'laws' by being 'proven'...A theory is an explanation for facts. A theory can embody a large set of statements which can grow as the theory expands to explain more observations, more facts.
It explains facts. It cannot 'become' a fact...The theory of gravity is NOT the 'dispute over whether gravity exists'...it means 'what explains gravity, or what is explained by gravity.'"
Posted at 12:09 PM in Linguistics, Science | Permalink | Comments (0) | TrackBack (0)
In "The Weatherman is Not a Moron," Nate Silver gives a fascinating tour of how forecast models have evolved and delves into how uncertainty is communicated to the public.
First up is the idea of modeling, something Hari Seldon himself would feel at home with. Snip:
"In 1814, the French mathematician Pierre-Simon Laplace postulated that the movement of every particle in the universe should be predictable as long as meteorologists could know the position of all those particles and how fast they are moving. Unfortunately, the number of molecules in the earth’s atmosphere is perhaps on the order of 100 tredecillion, which is a 1 followed by 44 zeros. To make perfect weather predictions, we would not only have to account for all of those molecules, but we would also need to solve equations for all 100 tredecillion of them at once."
Strikingly, it's the integration of the unknown into models that is making them more accurate. Like stock traders, meteorologists have begun to play the odds, acknowledging that they're an inherent part of the system.
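To make that "playing the odds" idea concrete, here's a toy sketch of my own (nothing from Silver's piece, and no real meteorology): run the same simple model many times from slightly perturbed starting conditions and report the fraction of runs that end in rain, rather than a single deterministic answer.

```python
# Toy ensemble forecast: acknowledge measurement uncertainty by running the model
# many times from slightly different starting points and counting outcomes.
import random

def toy_weather_model(humidity, days=3):
    # A made-up, chaotic update rule standing in for a real forecast model;
    # tiny differences in the starting value diverge quickly (logistic map).
    for _ in range(days):
        humidity = 3.9 * humidity * (1.0 - humidity)
    return humidity

def rain_probability(measured_humidity, n_runs=1000, sensor_error=0.01):
    # Perturb the initial measurement to represent what we don't know exactly.
    rainy_runs = 0
    for _ in range(n_runs):
        start = measured_humidity + random.uniform(-sensor_error, sensor_error)
        if toy_weather_model(start) > 0.7:  # arbitrary "it rains" threshold
            rainy_runs += 1
    return rainy_runs / n_runs

print(f"Chance of rain: {rain_probability(0.52):.0%}")
```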
Unfortunately, communicating that uncertainty carries cultural baggage, motivating experts to hold back their full analysis. Private-sector forecasters have a "wet" bias toward predictions of rain, because the public is happy if it's sunny when rain is predicted, and angry if it rains when sun is forecast. In a more serious case, Silver reports on a 1997 flood in Grand Forks, North Dakota, where the forecast's margin of error wasn't shared with the public. The result? Millions in damage that could have been averted through mitigation efforts like sandbagging.
"The forecasters later told researchers that they were afraid the public might lose confidence in the forecast if they had conveyed any uncertainty."
Posted at 12:10 PM in Futuring, Predicting, Science | Permalink | Comments (0) | TrackBack (0)
It's an interesting -- if irritating -- phenomenon. You're at a party, someone makes a witty joke at your expense, and it's only later that you think of the perfect comeback. "If only I had come up with it sooner!" you think, replaying the scene in your mind and creating an alternate reality of how the rest of the night would have turned out. (Remember George Costanza's "the jerk store called, and they're running out of you!")
Well, the French have a term for this: L'esprit de l'escalier or "staircase wit." The philosopher Denis Diderot came up with it after an unfortunate dinner. From Wikipedia:
"During a dinner at the home of statesman Jacques Necker, a remark was made to Diderot which left him speechless at the time, because, he explains, "l’homme sensible, comme moi, tout entier à ce qu’on lui objecte, perd la tête et ne se retrouve qu’au bas de l’escalier" ("a sensitive man, such as myself, overwhelmed by the argument levelled against him, becomes confused and can only think clearly again [when he reaches] the bottom of the stairs"). In this case, “the bottom of the stairs” refers to the architecture of the kind of hôtel particulier or mansion to which Diderot had been invited."
Posted at 11:58 AM in Linguistics | Permalink | Comments (0) | TrackBack (0)
We're all time travelers; we just happen to be moving into the future one second at a time. So it's pleasantly shocking to look back and see how far we've come. The two videos below make that clear. The first, from College Humor, portrays how the TV show "24" would have looked if it had been made in 1994, illustrating the gradual nature of radical change.
College Humor: "24: The Unaired 1994 Pilot"
The second video, from Amazon.com, shows the exponential nature of small changes. New innovations don't just make one thing better; they set off a series of chain reactions in other areas of life. Put two innovations together and the impact is, at a minimum, squared. The dot-com age brought us thousands of innovations, with the result being a totally different sense of what "normal" is. (Listen for where the ad says, "And when we build you something new you can expect everything to change a little more.")
Amazon.com Kindle Ad: "Normal Just Begs to be Messed With"
Posted at 10:08 AM | Permalink | Comments (0) | TrackBack (0)
Check out this great article in the Washington Post that explores how Bond movies prepared a generation for gadgets. It notes, "Bond taught us to think big when it comes to innovation, and it was never 'incremental' — it was always terribly 'disruptive.'" Here's the slideshow.
Posted at 09:04 AM in Futuring, Pop Culture | Permalink | Comments (0) | TrackBack (0)
The Storytelling Animal asks the question, "Why are people programmed to love stories?" and gives a highly credible answer. In short? Just like the opposable thumb, the ability to understand, create and share stories is an evolutionary advantage. There are a few reasons why this could be true. Stories are:
1. A low-cost method of storing and transmitting information. That's handy when you're trying to survive for thousands of years without pencil and paper to capture lessons learned.
2. A way of showing off cognitive and memory capabilities -- just as a peacock's tail is a reflection of its health and strength, and so proof of suitability as a mate. In other words, storytellers are hot and, as a result, had more children.
3. Simulators for managing future social interactions. Cooperation and competition have been the two models humanity has used to advance itself, and those who could do both better increased their chances of surviving and passing on their genes. This also explains why the heart of story is conflict, and why children's imaginations and animals' dreams are filled with fights and chases.
The author, Jonathan Gottschall, does a great job of revealing how much of our daily lives are consumed by story. Sure, we all know TV shows, commercials and books rely on narrative. But so, too, do daydreams (2,000 a day!), night dreams (which almost always have a story-like structure) and coverage of the news, politics and sports. Story really is everywhere.
Our brains' strong desire to find patterns and meaning in everything also has side effects. Conspiracy theories, which always offer simple reasons and good-versus-evil structures, are one. The desire to see oneself as a heroic protagonist also leads individuals to misremember past events. This even happens at the national level, where countries have their own creation myths that skip over the bad stuff.
The Storytelling Animal is a terrific read. I highly recommend it to marketers, futurists...come to think of it, everyone should read this.
Posted at 12:49 AM in Evolution, Evolutionary Psychology | Permalink | Comments (0) | TrackBack (0)
Everyone's an optimist until they get shot in the head with a magnetic gun.
But seriously, researchers at University College London have just published research showing that a part of the brain is responsible for our "optimism bias." That's the miscalculation each of us makes in assuming that bad things are more likely to happen to other people than to ourselves.
Using a technique called "transcranial magnetic stimulation," researchers turned off a portion of the brain called the "left inferior frontal gyrus," which tracks information that is better than the person expected. (The right portion handles information that is worse than expected.) The result? Test subjects' estimates of the chance of bad things happening to them rose to more realistic levels.
As The Scientist notes, "Of course, it may not always be beneficial to suppress such natural biases. Unwarranted optimism can lead us to take unnecessary risks with our health or finances, but Sharot and Kanai noted that it could also be adaptive, by encouraging people to try new things or avoid the stress of potential illness or failure."
Posted at 11:42 AM in Neurology, Predicting | Permalink | Comments (0) | TrackBack (0)
Talk about being poor predictors! It turns out that 54.8% of people who tell pollsters that they don't think they'll vote end up voting. And on the other side of the coin, 13.3% of those who say they are "almost certain" to vote don't show up at the polls. This matters a lot when it comes to predicting election turnout, and it also reveals a type of prediction people often make about themselves and get wrong.
Tip of the hat to Politico for uncovering this gem. Here's the Harvard study they source: "Why Bother Asking? The Limited Value of Self-Reported Vote Intention."
Posted at 09:56 AM in Predicting | Permalink | Comments (0) | TrackBack (0)
Interesting post over at io9 about where the idea of the Rapture originated, how it spread and evolved, and which churches do and do not believe in it. The idea has gained significant traction in pop culture due to the "Left Behind" series.
Posted at 11:00 AM in Dystopianism, Religion | Permalink | Comments (0) | TrackBack (0)
Here are this week's best news, stories and memes on the topic of tomorrow. Miss something? Add it to the comments below :)
E.O. Wilson has a fascinating op-ed on how altruism and selfishness evolved as advantageous traits in humans and what the future of this inner conflict holds. He says, "We will find a way eventually to live with our inborn turmoil, and perhaps find pleasure in viewing it as a primary source of our creativity."
If aliens attack, Americans trust President Obama more than Romney to handle the situation. [HuffPo]
"Generational warfare" is a term thrown around too often these days, but David Leonhardt has a hyperbole-free column on the dangerous divide that has opened between old and young. He concludes of the younger generation, "They wish the country would devote more attention to its future, especially on education and the climate. They, of course, will have to live with that future." Read the whole thing here.
The phrase "uncanny valley" has been popping up a lot (IEEE, The Verge, PC Magazine, NY Times). It refers to the point at which a technology's helpfulness veers from useful to creepy. Check out Wikipedia's entry on it and then the IEEE link for the full translated version of the 1970 paper that brought the meme into existence.
A Back to the Future hoax spread like wildfire on Facebook this week. In the second film, Marty McFly traveled to October 21, 2015, but someone mocked up an image of the DeLorean's dashboard to make it look like he traveled to June 27, 2012. A LOT of Facebookers thought they were in the future, but alas, it's still the past ;)
And on June 30, a "leap second" will be added to the day because the Earth's rotation is slowing slightly. Enjoy the extra moment's rest. Literally.
Posted at 03:14 PM in This Week | Permalink | Comments (0) | TrackBack (0)