You read that correctly. There have been all sorts of theories as to why discrimination towards women seems so pervasive and near-universal, and where it comes from in the first place. But a crude farming tool is by far the most interesting and unexpected origin. As the Economist – my most cherished and regularly read source – recently reported, a team of economists, of all people, set out to show that the adoption of the plow coincided with a change of attitudes towards women that persists to this day.
Specifically, a move towards large-scale and labor-intensive agriculture – defined by the adoption of the heavy plow – created an economic system in which one’s physical strength and endurance became a major basis for productivity, and the key to society’s survival. Men were naturally more adept in this new function, and from this crucial role they would subsequently come to dominate…
A cutting-edge scientific analysis shows that a Briton from 10,000 years ago had dark brown skin and blue eyes.
Researchers from London’s Natural History Museum extracted DNA from Cheddar Man, Britain’s oldest complete skeleton, which was discovered in 1903.
University College London researchers then used the subsequent genome analysis for a facial reconstruction.
It underlines the fact that the lighter skin characteristic of modern Europeans is a relatively recent phenomenon.
No prehistoric Briton of this age had previously had their genome analysed.
As such, the analysis provides valuable new insights into the first people to resettle Britain after the last Ice Age.
The analysis of Cheddar Man’s genome – the “blueprint” for a human, contained in the nuclei of our cells – will be published in a journal, and will also feature in the upcoming Channel 4 documentary The First Brit, Secrets Of The 10,000-year-old Man.
Cheddar Man’s remains were unearthed 115 years ago in Gough’s Cave, located in Somerset’s Cheddar Gorge. Subsequent examination has shown that the man was short by today’s standards – about 5ft 5in – and probably died in his early 20s.
Prof Chris Stringer, the museum’s research leader in human origins, said: “I’ve been studying the skeleton of Cheddar Man for about 40 years.
“So to come face-to-face with what this guy could have looked like – and that striking combination of the hair, the face, the eye colour and that dark skin: something a few years ago we couldn’t have imagined and yet that’s what the scientific data show.”
Fractures on the surface of the skull suggest he may even have met his demise in a violent manner. It’s not known how he came to lie in the cave, but it’s possible he was placed there by others in his tribe.
The Natural History Museum researchers extracted the DNA from part of the skull near the ear known as the petrous. At first, project scientists Prof Ian Barnes and Dr Selina Brace weren’t sure if they’d get any DNA at all from the remains.
But they were in luck: not only was DNA preserved, but Cheddar Man has since yielded the highest coverage (a measure of the sequencing accuracy) for a genome from this period of European prehistory – known as the Mesolithic, or Middle Stone Age.
They teamed up with researchers at University College London (UCL) to analyse the results, including gene variants associated with hair, eye and skin colour.
Extra mature Cheddar
They found the Stone Age Briton had dark hair – with a small probability that it was curlier than average – blue eyes and skin that was probably dark brown or black in tone.
This combination might appear striking to us today, but it was a common appearance in western Europe during this period.
Steven Clarke, director of the Channel 4 documentary, said: “I think we all know we live in times where we are unusually preoccupied with skin pigmentation.”
Prof Mark Thomas, a geneticist from UCL, said: “It becomes a part of our understanding, I think that would be a much, much better thing. I think it would be good if people lodge it in their heads, and it becomes a little part of their knowledge.”
Unsurprisingly, the findings have generated lots of interest on social media.
Cheddar Man’s genome reveals he was closely related to other Mesolithic individuals – so-called Western Hunter-Gatherers – who have been analysed from Spain, Luxembourg and Hungary.
Dutch artists Alfons and Adrie Kennis, specialists in palaeontological model-making, took the genetic findings and combined them with physical measurements from scans of the skull. The result was a strikingly lifelike reconstruction of a face from our distant past.
Pale skin probably arrived in Britain with a migration of people from the Middle East around 6,000 years ago. This population had pale skin and brown eyes and absorbed populations like the ones Cheddar Man belonged to.
No-one’s entirely sure why pale skin evolved in these farmers, but their cereal-based diet was probably deficient in vitamin D. They would have needed to compensate by synthesising this essential nutrient in their skin from sunlight – and lighter skin lets through more of the UV light required to do so.
“There may be other factors that are causing lower skin pigmentation over time in the last 10,000 years. But that’s the big explanation that most scientists turn to,” said Prof Thomas.
Boom and bust
The genomic results also suggest Cheddar Man could not drink milk as an adult. This ability only spread much later, after the onset of the Bronze Age.
Present-day Europeans owe on average 10% of their ancestry to Mesolithic hunters like Cheddar Man.
Britain has been something of a boom-and-bust story for humans over the last million-or-so years. Modern humans were here as early as 40,000 years ago, but a period of extreme cold known as the Last Glacial Maximum drove them out some 10,000 years later.
There’s evidence from Gough’s Cave that hunter-gatherers ventured back around 15,000 years ago, establishing a temporary presence when the climate briefly improved. However, they were soon sent packing by another cold snap. Cut marks on the bones suggest these people cannibalised their dead – perhaps as part of ritual practices.
Britain was once again settled 11,000 years ago, and has been inhabited ever since. Cheddar Man was part of this wave of migrants, who walked across a landmass called Doggerland that, in those days, connected Britain to mainland Europe. This makes him the oldest known Briton with a direct connection to people living here today.
This is not the first attempt to analyse DNA from Cheddar Man. In the late 1990s, Oxford University geneticist Brian Sykes sequenced mitochondrial DNA from one of Cheddar Man’s molars.
Mitochondrial DNA comes from the biological “batteries” within our cells and is passed down exclusively from a mother to her children.
Prof Sykes compared the ancient genetic information with DNA from 20 living residents of Cheddar village and found two matches – including history teacher Adrian Targett, who became closely connected with the discovery. The result is consistent with the approximately 10% of Europeans who share the same mitochondrial DNA type.
Developers of platforms such as Facebook have admitted that they were designed to be addictive. Should we be following the executives’ example and going cold turkey – and is it even possible for mere mortals?
Facebook’s locked-down nature means mere mortals can’t see the private posts on Zuckerberg’s timeline, but it is hard to imagine him getting into arguments about a racist relative’s post of an anti-immigration meme. And it is not just Zuckerberg. None of the company’s key executives has a “normal” Facebook presence. You can’t add them as friends, they rarely post publicly and they keep private some information that the platform suggests be made public by default, such as the number of friends they have.
Over at Twitter, the story is the same. Of the company’s nine most senior executives, only four tweet more than once a day on average. Ned Segal, its chief financial officer, has been on the site for more than six years and has sent fewer than two tweets a month. Co-founder Jack Dorsey, a relatively prolific tweeter, has sent about 23,000 since the site was launched, but that is far fewer than even halfway engaged users have sent over the same period. Dorsey rarely replies to strangers and avoids discussions or arguments on the site. He doesn’t live-tweet TV shows or sporting fixtures. In fact, he doesn’t really “use” Twitter; he just posts on it occasionally.
I am a compulsive social media user. I have sent about 140,000 tweets since I joined Twitter in April 2007 – six Jacks’ worth. I use Instagram, Snapchat and Reddit daily. I have accounts on Ello, Peach and Mastodon (remember them? No? Don’t worry). Three years ago, I managed to quit Facebook. I went cold turkey, deleting my account in a moment of lucidity about how it made me feel and act. I have never regretted it, but I haven’t been able to pull the same stunt twice.
I used to look at the heads of the social networks and get annoyed that they didn’t understand their own sites. Regular users encounter bugs, abuse or bad design decisions that the executives could never understand without using the sites themselves. How, I would wonder, could they build the best service possible if they didn’t use their networks like normal people?
Now, I wonder something else: what do they know that we don’t?
The first to break ranks was Sean Parker, Facebook’s founding president, who described the thinking behind the site’s design. “The thought process that went into building these applications, Facebook being the first of them … was all about: ‘How do we consume as much of your time and conscious attention as possible?’ That means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that’s going to get you to contribute more content and that’s going to get you … more likes and comments,” he said.
“It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology. The inventors, creators – me, Mark [Zuckerberg], Kevin Systrom on Instagram, all of these people – understood this consciously. And we did it anyway.”
A month later, Parker was joined by another Facebook objector, former vice-president for user growth Chamath Palihapitiya. “The short-term, dopamine-driven feedback loops that we have created are destroying how society works. No civil discourse, no cooperation; misinformation, mistruth,” Palihapitiya said at a conference in Stanford, California. “This is not about Russian ads. This is a global problem. It is eroding the core foundations of how people behave by and between each other. I can control my decision, which is that I don’t use that shit. I can control my kids’ decisions, which is that they’re not allowed to use that shit.”
Palihapitiya’s statements rattled Facebook so much that the company issued a response acknowledging its past failings – a rare move for a business that, despite its mission to “connect people”, is notoriously taciturn about its shortcomings. “When Chamath was at Facebook, we were focused on building new social media experiences and growing Facebook around the world,” a company spokeswoman said. “Facebook was a very different company back then … as we have grown, we have realised how our responsibilities have grown, too. We take our role very seriously and we are working hard to improve.”
A few days later, the site pulled a more interesting move, releasing the results of research that suggested that Facebook did make users feel bad – but only if they didn’t post enough. “In general, when people spend a lot of time passively consuming information – reading, but not interacting with people – they report feeling worse afterward,” two Facebook researchers said in a review of the existing literature. On the other hand, “actively interacting with people – especially sharing messages, posts and comments with close friends and reminiscing about past interactions – is linked to improvements in wellbeing”. How convenient.
For Adam Alter, a psychologist and the author of Irresistible, an examination of technology addiction, it is almost beside the point whether social media makes you happy or sad in the short term. The deeper issue is that your usage is compulsive – or even addictive.
“The addiction concept applies much more broadly and to many more behaviours than we perhaps thought and also therefore applies to many more people in the population,” Alter says. “Roughly half the adult population has at least one behavioural addiction. Not many of us have substance addictions, but the way the world works today there are many, many behaviours that are hard for us to resist and a lot of us develop self-undermining attachments to those behaviours that border on or become addictions.”
These addictions haven’t happened accidentally, Alter argues. Instead, they are a direct result of the intention of companies such as Facebook and Twitter to build “sticky” products, ones that we want to come back to over and over again. “The companies that are producing these products, the very large tech companies in particular, are producing them with the intent to hook. They’re doing their very best to ensure not that our wellbeing is preserved, but that we spend as much time on their products and on their programs and apps as possible. That’s their key goal: it’s not to make a product that people enjoy and therefore becomes profitable, but rather to make a product that people can’t stop using and therefore becomes profitable.
“What Parker and Palihapitiya are saying is that these companies, companies that they’ve been exposed to at the highest levels and from very early on, have been founded on these principles – that we should do everything we possibly can to hack human psychology, to understand what it is that keeps humans engaged and to use those techniques not to maximise wellbeing, but to maximise engagement. And that’s explicitly what they do.”
Parker and Palihapitiya aren’t the only Silicon Valley residents to open up about their unease with the habit-forming nature of modern technology. As the Guardian reported in October, a growing number of coders and designers are quitting their jobs in disillusionment at what their work entails. From Chris Marcellino – one of the inventors of Apple’s system for push notifications, who quit the industry to train as a neurosurgeon – to Loren Brichter – who created the pull-to-refresh motion that turns so many apps into miniature one-armed bandits and is now devoting his time to building a house in New Jersey – many of the workers at the coalface of interface design have had second thoughts.
Others have had the same realisation, but have decided to embrace the awkwardness – such as LA-based retention consultants Dopamine Labs. The company offers a plugin service that personalises “moments of joy” in apps that use it. It promises customers: “Your users will crave it. And they’ll crave you.”
If this is the case, then social media executives are simply following the rule of pushers and dealers everywhere, the fourth of the Notorious BIG’s Ten Crack Commandments: “Never get high on your own supply.”
“Many tech titans are very, very careful about how they privately use tech and how they allow their kids to use it and the extent to which they allow their kids access to screens and various apps and programs,” says Alter. “They will get up on stage, some of them, and say things like: ‘This is the greatest product of all time,’ but then when you delve you see they don’t allow their kids access to that same product.”
Last week, Apple’s chief executive, Tim Cook, told the Guardian: “I don’t have a kid, but I have a nephew that I put some boundaries on. There are some things that I won’t allow. I don’t want them on a social network.”
He added: “Technology by itself doesn’t want to be good and it doesn’t want to be bad either. It takes humans to make sure that the things that you do with it are good. And it takes humans in the development process to make sure the creation of the product is a good thing.”
Alter says that the classic example of this approach is Cook’s predecessor, Steve Jobs, “who spoke about all the virtues of the iPad and then wouldn’t let his kids near it”. (“They haven’t used it,” Jobs told a New York Times reporter a few months after the iPad was released. “We limit how much technology our kids use at home.”)
It is not only children. “You can see it in their own behaviour,” Alter says. “Jack Dorsey, the way he uses Twitter, it seems he’s very careful about how much time he spends. He’s obviously a very busy guy and a very high-functioning guy, but as a result he’s probably distracted by a lot of other things and he’s able to tear himself away from the platform.
“But that’s not true for all of the users of Twitter – many of them report being, using the colloquial term, addicted. Whether or not that’s clinical addiction, it feels to them like they would like to be doing less; it’s undermining their wellbeing. And I think that’s absolutely right: for many Twitter users, it’s sort of a black hole that sucks you in and it’s very hard to stop using the program.”
That is certainly how I feel about Twitter. I have tried to cut back, after realising how much of my time was spent staring at a scrolling feed of aphorisms ranging from mildly amusing to vaguely traumatic. I deleted 133,000 tweets, in an effort to reduce the feeling that I couldn’t give up on something into which I had sunk so much time. I removed the apps from my phone and my computer, forcing any interaction through the web browser. I have taken repeated breaks. But I keep coming back.
It is one thing to be a child with a protective parent keeping technology away from you. It is quite another to live like a technology executive yourself, defeating the combined effort of thousands of the world’s smartest people to instil a craving to open their app every day. I am not alone in struggling.
Kevin Holesh, a freelance app developer, is one of those who tried to cut back. He wrote a program, Moment, that tracks how long you spend looking at your phone each day. For the average user, it is more than three hours every day. Holesh’s stats were enough to provide the motivation for change. “Once I had the hard data, that itself was helping me use my phone less. I’ve taken a few steps in that direction since, but I knew just seeing the number itself was half the battle. Seeing that number really changed my approach … I was spending an hour a day not doing anything productive, just wasting time.”
Holesh eventually removed all social networks, and his work email account, from his phone. “That’s the step that helps me the most, simply not having it accessible. At first, my mission was just: find out what amount of phone time makes you happy. But now I’ve started a little more extreme approach, I’m less stressed out about news articles or my uncle posting something inflammatory on Facebook. I find I do a better job of communicating in more old-fashioned methods.”
Alter says willpower can help to a certain extent, while leaving things out of reach for casual, thoughtless use can help more. Ultimately, however, addictions are hard to break alone.
“It’s possible that in 20 years we’ll look back at the current generation of children and say: ‘Look, they are socially different from every other generation of humans that came before and as a result this is a huge problem and maybe we need to regulate these behaviours.’ Or perhaps we’ll look back and say: ‘I don’t know what the fuss was – I’m not sure why we were so concerned.’ Until we have some evidence, until there’s something that seems tangible, I think it’s going to be very hard to get people en masse to change how they behave.”
If you can’t bring yourself to cut back on social media, you could try following Zuckerberg’s example and hire a team of 12 to do it for you. It might not be as cheap and easy as deleting Facebook, but it is probably easier to stick to.
Sean Austin said his family’s relationship changed when he told them he did not believe in God.
Austin told his family last Christmas, two years after he stopped believing in God. “They were extremely disappointed,” said Austin, who described his family as very religious. “All through Christmas Eve, Christmas day … the entire break we were having arguments constantly.”
“They were disappointed that I had given up faith so easily,” Austin said. “They assumed I was being weak. They thought they had raised me wrong.”
Austin is a junior at DePaul University, where he is a member of the DePaul Alliance for Freethought, a group for students who do not believe in or who question God’s existence. Austin said he had never met another black atheist before he came to college.