Category Archives: News & current affairs

Repost: Everyone’s Missing the Obvious About the Declining U.S. Birth Rate

(Reposted from https://hurnpublications.com/2019/01/28/everyones-missing-the-obvious-about-the-declining-u-s-birth-rate/)


7 COUNTERARGUMENTS IN RESPONSE TO ANYONE WHO BLAMES THE BABY BUST ON WOMEN OR MILLENNIALS.


For the past several days, my Facebook feed, Twitter timeline, and evening news have been filled with stories on the Centers for Disease Control and Prevention’s latest report about the declining birth rate of U.S. women.

Despite the breadth of the data included in its January 2019 vital statistics update, the CDC statistic generating the biggest headlines is the one that calculates the birth rate in the U.S. to be 16 percent below the amount needed to replace our population over time.

Most of the stories dominating the news cycle have sensational, clickbait headlines: “Women in the U.S. Are Having Fewer Babies” (Time); “U.S. Fertility Rates Have Plummeted Into Uncharted Territory, and Nobody Knows Why” (Science Alert); “The U.S. Is in the Danger Zone for a ‘Demographic Time Bomb’” (Insider); and “Florida, U.S. Have a Baby Problem” (Orlando Sentinel). Among my personal favorites are the headlines where women are blamed as if it’s all immaculate conception — “Women Aren’t Having Enough Babies to Replace Ourselves” (Moms).

But all of the news noise is missing the glaringly obvious facts that every millennial I know recognizes immediately. Here are seven real reasons behind the declining birth rate:

1. The U.S. has the highest maternal mortality rate of all developed countries.

When NPR and ProPublica investigated maternal mortality in the U.S., their findings were clear: “More American women are dying of pregnancy-related complications than any other developed country. Only in the U.S. has the rate of women who die been rising.”

Here, around 26 out of every 100,000 pregnant women die each year, and while some European and Eurasian countries have rates in the teens, the U.S. maternal mortality rate is rising while most other countries are seeing theirs decline. The U.K., for example, has a rate of around nine deaths per 100,000 women, and the Lancet noted that the country’s efforts to reduce maternal mortality have meant “being pregnant in the U.K. has never been safer.” Our rate, meanwhile, remains roughly three times greater and is increasing.

The lowest rates around the globe are three deaths per 100,000, which means that compared with women in places like Finland, Iceland, and Greece, mothers in the U.S. are dying at a rate roughly nine times greater. And that’s not even taking race into account.

But it’s not just a higher risk of death here in the U.S. that makes birth difficult, it’s also a matter of cost.

2. Giving birth in the U.S. is exceedingly expensive.

The United States is the most expensive place in the world to give birth. In the Guardian’s analysis, it costs around $32,000 to give birth vaginally in the U.S. if you don’t have insurance. If you require a C-section, those costs increase to around $51,000.

Think about that: Without insurance, it’s the financial equivalent of an extremely nice car or multiple years of tuition at a state university merely to have a baby. And that’s if everything goes well. If there are complications — either for the child or the mother — those costs can quickly escalate to six figures or more.

Say you’re lucky enough to have great insurance, though — even then, insurers only negotiate the cost down to around $10,000 for a vaginal birth. In Spain (with its supposedly dreadful socialized medicine), it costs insurance companies an average of just $1,950 for women to give birth vaginally.

That’s not a matter of lower cost for lower quality; Spain’s maternal mortality rate is one-fifth that of the U.S. So while pregnant women in the U.S. with insurance might be paying less out of pocket, the costs paid by their insurance companies are still astronomically high and are no doubt recouped by charging higher insurance rates.

What we have in the U.S. then, is — on top of a high mortality rate — the added burden of giving birth made to be ridiculously expensive. And once you have the baby, we’ve created a system that fails to take care of U.S. mothers further.

3. The U.S. has no national mandate for paid parental leave.

In another disappointing distinction, the United States remains just about the only industrialized country, and the only OECD member country, not to require paid parental leave. It’s up to the discretion of individual employers to decide whether or not to offer paid maternal or parental leave.

Governments in other countries provide weeks or even years of assured income, in varying amounts, to moms and, increasingly, dads. It’s a simple argument — as Christopher Ingraham writes in the Washington Post:

At the risk of stating the obvious, having kids is a necessary condition for our biological and economic survival. The species must perpetuate itself, and at the country level, if economic growth is to continue, it behooves couples to churn out as many future employees and taxpayers as possible.

Not only is it more dangerous and more expensive to have a child in the United States, but even after that, we’re not going to provide parents with the financial security to ensure they can take time to be with their newborn. With these facts, is it really that shocking that we have a declining birth rate?

Even if you manage to get past all of this, there’s an additional financial burden that becomes a major factor in keeping people from having kids.

4. Child care can cost as much as rent.

Care.com’s latest annual survey, which factors in parents who use nannies as well as those who opt for daycare, put the national average of child care expenses at $1,500 per month, or $18,000 per year. That’s equivalent to a national-average rent payment, which Rent Cafe put at $1,405.

The U.S. Department of Health and Human Services considers child care “affordable” if it costs no more than 10 percent of a family’s income. By that standard, Care.com’s average of $18,000 would require a combined household income of $180,000. But the latest figures from the Census put the median household income at $61,372, meaning child care for the average family consumes closer to 30 percent of income.

Contributing almost 30 percent of your household income to child care is untenable for many multi-income families, much less single-parent households. Especially considering the state of most people’s personal finances.

5. Wages have remained nearly stagnant for 40 years.

Pew Research found that “today’s real average wage (that is, the wage after accounting for inflation) has about the same purchasing power it did 40 years ago. And what wage gains there have been have mostly flowed to the highest-paid tier of workers.”

This means that as the cost of living continues to skyrocket, wages aren’t keeping pace for most of us. This makes it all that much harder to afford both quality health care and quality child care. Plus, while wages have barely budged, our debt is crushing us.

6. We are struggling to make ends meet even before having children.

Our level of debt — which for many young people today is student loan debt — drastically outpaces that of any generation before us, according to the Federal Reserve. Its most recent study flat out states that “millennials are less financially well-off than members of earlier generations when they were the same ages, with ‘lower earnings, fewer assets and less wealth.’”

If high maternal mortality rates, the high cost of having a baby, the lack of guaranteed income after having that baby, and child care that consumes a significant share of already stagnant wages (for a generation that is openly worse off financially than previous ones) aren’t enough evidence for why the birth rate is decreasing, there’s one more factor to bring it home.

7. We’re disillusioned and burned out.

If you haven’t read Anne Helen Petersen’s brilliant piece on how millennials are the burnout generation, you really should. She explains how our parents raised us in relatively stable economic and political times and reared us with an eye to our hard-working futures, with the belief we’d be even better off than they were. From purposeful play time to post-graduate school expectations, we have, as Petersen writes, “internalized the idea that [we] should be working all the time.” But we very rarely reach the dangling carrot. Saddled with the nation’s financial crisis on top of our own mountains of debt and the idea that we are failures if we stop, we are — quite simply — exhausted.

Now, I’m not saying that no one should have children. I have many friends who have amazing kids, and I love them. But if you’re going to act shocked at why the birth rate is decreasing, or worse, try to analyze it while ignoring all of the above realities, then you’re doing everyone a disservice.

Originally posted on Medium.


Poem: CHAINED TO A BUBBLE

They say London is a bubble

A cosmopolitan melting hub of cultures, colours, tongues

And prosperity.

But something feels wrong

An intangible crown of barbed wire wrapped round my head

An intangible coat of shackles round my torso

Cleverly leaving limbs free to move

To make me think I’m getting somewhere

To make me think I’m handling shit

But I can see the signs.

This empire is imploding,

Every new inch of virtual ground it gains it hoists itself toward its own end

A black hole whose epicentre is so strong good & evil are forced to

Conjugate

Procreate

Transcribe

Translate

Then replicate

Propagate

Monopolise

Indoctrinate

Into a whole new species of social values.

These chains of

Pessimism,

Fear of terrorism,

Commercialised capitalist Christendom,

Moaning about our schizo weather,

Resigned to the rise in loneliness, depression, suicide and DV,

Disgust of pigeons and uneasy tolerance of foreigners who’ve been here longer than them,

McDs, Nando’s, Starbucks and chicken & chips,

Mortgages, taxes, credit card debts and mis-sold PPIs,

NHS doing less and less to serve the nation’s health,

High-rise flats and homelessness increasing simultaneously,

Under-age geniuses educated on a curriculum

Where social skills are ancient history

As new technologies march in to talk for them,

The still-not-finished Brexit deals

And football.

But they’re rusting

Trying to gloss it over with another worst economic downturn since records began

It doesn’t fool me anymore

The blackness calls,

The chaos invisibly chameleonically shape-shifting the borders of space and time

What goes up must come down

This time it’s not just London Bridge

Fossilised winds of pseudo-monocultural Britishness

Are blowing change into parched lungs again

It can’t be stopped.

Regardless of whose beliefs, lifestyle or hegemonic socio-economic policies it hurts,

Regardless of how far the human race wishes to overtake the borders of its origin planet,

Regardless of how badly they – & we – need this system to continue

Because it’s our proof of man’s superiority over nature herself,

Progress breathes on.

Time to loose the chains,

Let the bubble implode,

Feel the level playing field on which we really stand,

Remember the lessons that global domination taught us,

Re-nourish our spiritual evolution in love

Of the world,

Other creatures,

And ourselves.

 

© One Tawny Stranger, January 2018

Don’t tell me you haven’t heard about it…


WARNING: SPOILER ALERT!!!

Last Saturday night (17/2/2018) some friends and I went to watch the new Black Panther movie. Admittedly I’d had some reservations, primarily because …

Cheddar Man: DNA shows early Briton had dark skin

(Reposted from: https://www.bbc.co.uk/news/amp/science-environment-42939192)

Cheddar Man: DNA shows early Briton had dark skin


Image caption: DNA shows early Brit had dark skin

A cutting-edge scientific analysis shows that a Briton from 10,000 years ago had dark brown skin and blue eyes.

Researchers from London’s Natural History Museum extracted DNA from Cheddar Man, Britain’s oldest complete skeleton, which was discovered in 1903.

University College London researchers then used the subsequent genome analysis for a facial reconstruction.

It underlines the fact that the lighter skin characteristic of modern Europeans is a relatively recent phenomenon.

No prehistoric Briton of this age had previously had their genome analysed.

As such, the analysis provides valuable new insights into the first people to resettle Britain after the last Ice Age.

The analysis of Cheddar Man’s genome – the “blueprint” for a human, contained in the nuclei of our cells – will be published in a journal, and will also feature in the upcoming Channel 4 documentary The First Brit: Secrets Of The 10,000-Year-Old Man.


Cheddar Man’s remains had been unearthed 115 years ago in Gough’s Cave, located in Somerset’s Cheddar Gorge. Subsequent examination has shown that the man was short by today’s standards – about 5ft 5in – and probably died in his early 20s.

Prof Chris Stringer, the museum’s research leader in human origins, said: “I’ve been studying the skeleton of Cheddar Man for about 40 years.

“So to come face-to-face with what this guy could have looked like – and that striking combination of the hair, the face, the eye colour and that dark skin: something a few years ago we couldn’t have imagined and yet that’s what the scientific data show.”

Image caption: A replica of Cheddar Man’s skeleton now lies in Gough’s Cave

Fractures on the surface of the skull suggest he may even have met his demise in a violent manner. It’s not known how he came to lie in the cave, but it’s possible he was placed there by others in his tribe.

The Natural History Museum researchers extracted the DNA from part of the skull near the ear known as the petrous. At first, project scientists Prof Ian Barnes and Dr Selina Brace weren’t sure if they’d get any DNA at all from the remains.

But they were in luck: not only was DNA preserved, but Cheddar Man has since yielded the highest coverage (a measure of the sequencing accuracy) for a genome from this period of European prehistory – known as the Mesolithic, or Middle Stone Age.

They teamed up with researchers at University College London (UCL) to analyse the results, including gene variants associated with hair, eye and skin colour.

Extra mature Cheddar

They found the Stone Age Briton had dark hair – with a small probability that it was curlier than average – blue eyes and skin that was probably dark brown or black in tone.

This combination might appear striking to us today, but it was a common appearance in western Europe during this period.

Steven Clarke, director of the Channel Four documentary, said: “I think we all know we live in times where we are unusually preoccupied with skin pigmentation.”

Prof Mark Thomas, a geneticist from UCL, said: “It becomes a part of our understanding, I think that would be a much, much better thing. I think it would be good if people lodge it in their heads, and it becomes a little part of their knowledge.”

Unsurprisingly, the findings have generated lots of interest on social media.

Cheddar Man’s genome reveals he was closely related to other Mesolithic individuals – so-called Western Hunter-Gatherers – who have been analysed from Spain, Luxembourg and Hungary.

Dutch artists Alfons and Adrie Kennis, specialists in palaeontological model-making, took the genetic findings and combined them with physical measurements from scans of the skull. The result was a strikingly lifelike reconstruction of a face from our distant past.

Pale skin probably arrived in Britain with a migration of people from the Middle East around 6,000 years ago. This population had pale skin and brown eyes and absorbed populations like the ones Cheddar Man belonged to.

Image caption: Prof Chris Stringer had studied Cheddar Man for 40 years – but was struck by the Kennis brothers’ reconstruction

No-one’s entirely sure why pale skin evolved in these farmers, but their cereal-based diet was probably deficient in Vitamin D. This would have required agriculturalists to absorb this essential nutrient from sunlight through their skin.

“There may be other factors that are causing lower skin pigmentation over time in the last 10,000 years. But that’s the big explanation that most scientists turn to,” said Prof Thomas.

Boom and bust

The genomic results also suggest Cheddar Man could not drink milk as an adult. This ability only spread much later, after the onset of the Bronze Age.

Present-day Europeans owe on average 10% of their ancestry to Mesolithic hunters like Cheddar Man.

Britain has been something of a boom-and-bust story for humans over the last million-or-so years. Modern humans were here as early as 40,000 years ago, but a period of extreme cold known as the Last Glacial Maximum drove them out some 10,000 years later.

There’s evidence from Gough’s Cave that hunter-gatherers ventured back around 15,000 years ago, establishing a temporary presence when the climate briefly improved. However, they were soon sent packing by another cold snap. Cut marks on the bones suggest these people cannibalised their dead – perhaps as part of ritual practices.

Image caption: The actual skull of Cheddar Man is kept in the Natural History Museum, seen here being handled by Prof Ian Barnes

Britain was once again settled 11,000 years ago, and it has been inhabited ever since. Cheddar Man was part of this wave of migrants, who walked across a landmass called Doggerland that, in those days, connected Britain to mainland Europe. This makes him the oldest known Briton with a direct connection to people living here today.

This is not the first attempt to analyse DNA from Cheddar Man. In the late 1990s, Oxford University geneticist Brian Sykes sequenced mitochondrial DNA from one of Cheddar Man’s molars.

Mitochondrial DNA comes from the biological “batteries” within our cells and is passed down exclusively from a mother to her children.

Prof Sykes compared the ancient genetic information with DNA from 20 living residents of Cheddar village and found two matches – including history teacher Adrian Targett, who became closely connected with the discovery. The result is consistent with the approximately 10% of Europeans who share the same mitochondrial DNA type.

Follow Paul on Twitter.

 

‘Never get high on your own supply’ – why social media bosses don’t use social media

Reposted from: https://www.theguardian.com/media/2018/jan/23/never-get-high-on-your-own-supply-why-social-media-bosses-dont-use-social-media

Developers of platforms such as Facebook have admitted that they were designed to be addictive. Should we be following the executives’ example and going cold turkey – and is it even possible for mere mortals?

 


Mark Zuckerberg doesn’t use Facebook like you or me. The 33-year-old chief executive has a team of 12 moderators dedicated to deleting comments and spam from his page, according to Bloomberg. He has a “handful” of employees who help him write his posts and speeches, and a number of professional photographers who take perfectly stage-managed pictures of him meeting veterans in Kentucky, small-business owners in Missouri or cheesesteak vendors in Philadelphia.

Facebook’s locked-down nature means mere mortals can’t see the private posts on Zuckerberg’s timeline, but it is hard to imagine him getting into arguments about a racist relative’s post of an anti-immigration meme. And it is not just Zuckerberg. None of the company’s key executives has a “normal” Facebook presence. You can’t add them as friends, they rarely post publicly and they keep private some information that the platform suggests be made public by default, such as the number of friends they have.

Over at Twitter, the story is the same. Of the company’s nine most senior executives, only four tweet more than once a day on average. Ned Segal, its chief financial officer, has been on the site for more than six years and has sent fewer than two tweets a month. Co-founder Jack Dorsey, a relatively prolific tweeter, has sent about 23,000 since the site was launched, but that is far fewer than even moderately engaged users have sent over the same period. Dorsey rarely replies to strangers and avoids discussions or arguments on the site. He doesn’t live-tweet TV shows or sporting fixtures. In fact, he doesn’t really “use” Twitter; he just posts on it occasionally.

I am a compulsive social media user. I have sent about 140,000 tweets since I joined Twitter in April 2007 – six Jacks’ worth. I use Instagram, Snapchat and Reddit daily. I have accounts on Ello, Peach and Mastodon (remember them? No? Don’t worry). Three years ago, I managed to quit Facebook. I went cold turkey, deleting my account in a moment of lucidity about how it made me feel and act. I have never regretted it, but I haven’t been able to pull the same stunt twice.

I used to look at the heads of the social networks and get annoyed that they didn’t understand their own sites. Regular users encounter bugs, abuse or bad design decisions that the executives could never understand without using the sites themselves. How, I would wonder, could they build the best service possible if they didn’t use their networks like normal people?

Now, I wonder something else: what do they know that we don’t?

Sean Parker, the founding president of Facebook, broke the omertà in October last year, telling a conference in Philadelphia that he was “something of a conscientious objector” to social media.

“The thought process that went into building these applications, Facebook being the first of them … was all about: ‘How do we consume as much of your time and conscious attention as possible?’ That means that we need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that’s going to get you to contribute more content and that’s going to get you … more likes and comments,” he said.

“It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology. The inventors, creators – me, Mark [Zuckerberg], Kevin Systrom on Instagram, all of these people – understood this consciously. And we did it anyway.”

A month later, Parker was joined by another Facebook objector, former vice-president for user growth Chamath Palihapitiya. “The short-term, dopamine-driven feedback loops that we have created are destroying how society works. No civil discourse, no cooperation; misinformation, mistruth,” Palihapitiya said at a conference in Stanford, California. “This is not about Russian ads. This is a global problem. It is eroding the core foundations of how people behave by and between each other. I can control my decision, which is that I don’t use that shit. I can control my kids’ decisions, which is that they’re not allowed to use that shit.”

Palihapitiya’s statements rattled Facebook so much that the company issued a response acknowledging its past failings – a rare move for a business that, despite its mission to “connect people”, is notoriously taciturn about its shortcomings. “When Chamath was at Facebook, we were focused on building new social media experiences and growing Facebook around the world,” a company spokeswoman said. “Facebook was a very different company back then … as we have grown, we have realised how our responsibilities have grown, too. We take our role very seriously and we are working hard to improve.”

A few days later, the site pulled a more interesting move, releasing the results of research that suggested that Facebook did make users feel bad – but only if they didn’t post enough. “In general, when people spend a lot of time passively consuming information – reading, but not interacting with people – they report feeling worse afterward,” two Facebook researchers said in a review of the existing literature. On the other hand, “actively interacting with people – especially sharing messages, posts and comments with close friends and reminiscing about past interactions – is linked to improvements in wellbeing”. How convenient.

For Adam Alter, a psychologist and the author of Irresistible, an examination of technology addiction, it is almost beside the point whether social media makes you happy or sad in the short term. The deeper issue is that your usage is compulsive – or even addictive.

“The addiction concept applies much more broadly and to many more behaviours than we perhaps thought and also therefore applies to many more people in the population,” Alter says. “Roughly half the adult population has at least one behavioural addiction. Not many of us have substance addictions, but the way the world works today there are many, many behaviours that are hard for us to resist and a lot of us develop self-undermining attachments to those behaviours that border on or become addictions.”

Image caption: Our addictions to social media ‘haven’t happened accidentally’, says psychologist Adam Alter. Illustration: Jason Ford for the Guardian

These addictions haven’t happened accidentally, Alter argues. Instead, they are a direct result of the intention of companies such as Facebook and Twitter to build “sticky” products, ones that we want to come back to over and over again. “The companies that are producing these products, the very large tech companies in particular, are producing them with the intent to hook. They’re doing their very best to ensure not that our wellbeing is preserved, but that we spend as much time on their products and on their programs and apps as possible. That’s their key goal: it’s not to make a product that people enjoy and therefore becomes profitable, but rather to make a product that people can’t stop using and therefore becomes profitable.

“What Parker and Palihapitiya are saying is that these companies, companies that they’ve been exposed to at the highest levels and from very early on, have been founded on these principles – that we should do everything we possibly can to hack human psychology, to understand what it is that keeps humans engaged and to use those techniques not to maximise wellbeing, but to maximise engagement. And that’s explicitly what they do.”

Parker and Palihapitiya aren’t the only Silicon Valley residents to open up about their unease with the habit-forming nature of modern technology. As the Guardian reported in October, a growing number of coders and designers are quitting their jobs in disillusionment at what their work entails. From Chris Marcellino – one of the inventors of Apple’s system for push notifications, who quit the industry to train as a neurosurgeon – to Loren Brichter – who created the pull-to-refresh motion that turns so many apps into miniature one-armed bandits and is now devoting his time to building a house in New Jersey – many of the workers at the coalface of interface design have had second thoughts.

Others have had the same realisation, but have decided to embrace the awkwardness – such as LA-based retention consultants Dopamine Labs. The company offers a plugin service that personalises “moments of joy” in apps that use it. It promises customers: “Your users will crave it. And they’ll crave you.”

If this is the case, then social media executives are simply following the rule of pushers and dealers everywhere, the fourth of the Notorious BIG’s Ten Crack Commandments: “Never get high on your own supply.”

“Many tech titans are very, very careful about how they privately use tech and how they allow their kids to use it and the extent to which they allow their kids access to screens and various apps and programs,” says Alter. “They will get up on stage, some of them, and say things like: ‘This is the greatest product of all time,’ but then when you delve you see they don’t allow their kids access to that same product.”

Last week, Apple’s chief executive, Tim Cook, told the Guardian: “I don’t have a kid, but I have a nephew that I put some boundaries on. There are some things that I won’t allow. I don’t want them on a social network.”

He added: “Technology by itself doesn’t want to be good and it doesn’t want to be bad either. It takes humans to make sure that the things that you do with it are good. And it takes humans in the development process to make sure the creation of the product is a good thing.”

Alter says that the classic example of this approach is Cook’s predecessor, Steve Jobs, “who spoke about all the virtues of the iPad and then wouldn’t let his kids near it”. (“They haven’t used it,” Jobs told a New York Times reporter a few months after the iPad was released. “We limit how much technology our kids use at home.”)

It is not only children. “You can see it in their own behaviour,” Alter says. “Jack Dorsey, the way he uses Twitter, it seems he’s very careful about how much time he spends. He’s obviously a very busy guy and a very high-functioning guy, but as a result he’s probably distracted by a lot of other things and he’s able to tear himself away from the platform.

“But that’s not true for all of the users of Twitter – many of them report being, using the colloquial term, addicted. Whether or not that’s clinical addiction, it feels to them like they would like to be doing less; it’s undermining their wellbeing. And I think that’s absolutely right: for many Twitter users, it’s sort of a black hole that sucks you in and it’s very hard to stop using the program.”

That is certainly how I feel about Twitter. I have tried to cut back, after realising how much of my time was spent staring at a scrolling feed of aphorisms ranging from mildly amusing to vaguely traumatic. I deleted 133,000 tweets, in an effort to reduce the feeling that I couldn’t give up on something into which I had sunk so much time. I removed the apps from my phone and my computer, forcing any interaction through the web browser. I have taken repeated breaks. But I keep coming back.

It is one thing to be a child with a protective parent keeping technology away from you. It is quite another to live like a technology executive yourself, defeating the combined effort of thousands of the world’s smartest people to instil a craving to open their app every day. I am not alone in struggling.

Kevin Holesh, a freelance app developer, is one of those who tried to cut back. He wrote a program, Moment, that tracks how long you spend looking at your phone each day. For the average user, it is more than three hours every day. Holesh’s stats were enough to provide the motivation for change. “Once I had the hard data, that itself was helping me use my phone less. I’ve taken a few steps in that direction since, but I knew just seeing the number itself was half the battle. Seeing that number really changed my approach … I was spending an hour a day not doing anything productive, just wasting time.”

Holesh eventually removed all social networks, and his work email account, from his phone. “That’s the step that helps me the most, simply not having it accessible. At first, my mission was just: find out what amount of phone time makes you happy. But now I’ve started a little more extreme approach, I’m less stressed out about news articles or my uncle posting something inflammatory on Facebook. I find I do a better job of communicating in more old-fashioned methods.”

Alter says willpower can help to a certain extent, while leaving things out of reach for casual, thoughtless use can help more. Ultimately, however, addictions are hard to break alone.

“It’s possible that in 20 years we’ll look back at the current generation of children and say: ‘Look, they are socially different from every other generation of humans that came before and as a result this is a huge problem and maybe we need to regulate these behaviours.’ Or perhaps we’ll look back and say: ‘I don’t know what the fuss was – I’m not sure why we were so concerned.’ Until we have some evidence, until there’s something that seems tangible, I think it’s going to be very hard to get people en masse to change how they behave.”

If you can’t bring yourself to cut back on social media, you could try following Zuckerberg’s example and hire a team of 12 to do it for you. It might not be as cheap and easy as deleting Facebook, but it is probably easier to stick to.
