"In the age of artificial intelligence, the answer to a more optimistic future may lie in redefining work itself.
WORK, as an idea, is both familiar and frustratingly abstract. We go to work, we finish our work, we work at something. It’s a place, an entity, tasks to be done or output to achieve. It’s how we spend our time and expend our mental and physical resources. It’s something to pay the bills, or something that defines us. But what, really, is work? And from a company’s perspective, what is the work that needs to be done? In an age of artificial intelligence, that’s not merely a philosophical question. If we can creatively answer it, we have the potential to create incredible value. And, paradoxically, these gains could come from people, not from new technology."
"The presence of a westward-moving frontier of settlement shaped early U.S. history. In 1893, the historian Frederick Jackson Turner famously argued that the American frontier fostered individualism. We investigate the Frontier Thesis and identify its long-run implications for culture and politics. We track the frontier throughout the 1790–1890 period and construct a novel, county-level measure of total frontier experience (TFE). Historically, frontier locations had distinctive demographics and greater individualism. Long after the closing of the frontier, counties with greater TFE exhibit more pervasive individualism and opposition to redistribution. This pattern cuts across known divides in the U.S., including urban–rural and north–south. We provide suggestive evidence on the roots of frontier culture: selective migration, an adaptive advantage of self-reliance, and perceived opportunities for upward mobility through effort. Overall, our findings shed new light on the frontier’s persistent legacy of rugged individualism." [Abstract]
How we see the world influences the real or perceived needs that inform our intentions. If I see the world as a place dominated by fierce competition for limited resources, I will fight others to get…
"There’s no room for argument about whether hate-filled internet message boards encourage real-world violence: they do, and none more so than 8chan. It normalises racism, misogyny, and extremism – and helps turn nightmarish, loud-mouthed talk of action into reality." —Destroyer of Worlds
This examination of the 8chan online community shows how anonymity can breed a very dark social structure that is impossible to control, even for the founder. It seems that even if this community were shut down, a new one would be created, as evidenced by the rapid migration of the Gamergate harassment group from 4chan to 8chan. The disruption of civil society becomes the raison d’être of these types of communities.
The global economy is in crisis. The exponential exhaustion of natural resources, declining productivity, slow growth, rising unemployment, and steep inequality...
The 4th Industrial Revolution (4IR) is a way of describing the blurring of boundaries between the physical, digital, and biological worlds. It’s a fusion of advances in artificial intelligence (AI), robotics, the Internet of Things (IoT), 3D printing, genetic engineering, quantum computing, and other technologies.
No kid ever dreamed of growing up and driving for Uber or styling for Stitch Fix. In part, that’s because none of those companies existed when most of today’s adults were young. It’s also because, besides its much-touted “flexibility,” the gig economy isn’t much of a place to build a career. Instead, over the course of less than a decade, the self-described “tech companies” that connect people to gig work have managed to erode hard-fought labor protections in place for a century.
In Hustle and Gig, to be published in March by University of California Press, sociologist Alexandrea Ravenelle interviews 80 gig workers who are struggling, striving, and succeeding. She analyzes their stories in the context of US employment history and concludes that “for all its app-enabled modernity, the gig economy resembles the early industrial age…the sharing economy is truly a movement forward to the past.”
As Millennials enter their early 30s, the focus is now shifting to Generation Z, a group that is just starting to enter the workforce. Generation Z does not remember a time when the internet did not exist – and as such, it’s not surprising to learn that 50% of Gen Z spends 10 hours a day connected online, and 70% watches YouTube for two hours a day or more.
But put aside this ultra-connectivity, and Gen Zers have some unique and possibly unexpected traits. Gen Z prefers face-to-face interactions in the workplace, and also expects to work harder than past groups. Gen Z is also the most diverse generation (49% non-white) and values racial equality as a top issue. Finally, Gen Z is possibly one of the most practical generations, valuing things like saving money and getting stable jobs.
For the purposes of this site, the history of human interaction with information may be divided into four eras. The first (spoken) era ended with the invention of writing around 3000–4000 BC. The second era ended with the invention of the printing press in 1440. The third era ended, and the fourth began, with the invention of the Internet (depending how one defines its operational beginning) somewhere between 1969 and 1982. We now exist early, but decidedly, in the fourth era.
Not all readers will agree with this interpretation of the history of information, especially with the division and numbering of the eras. That is not the main point. The main point is that humankind presently exists in an era distinctly different from the one that preceded it, an era accompanied and characterized by a new and quite different information landscape. This new Internet information landscape will challenge, disrupt, and overpower the print-oriented one that came before it. It will not completely obliterate what preceded it, but it will relegate it to a subsidiary, rather than primary, level of influence.
Just as the printing press altered humanity's relationship with information, thereby resulting in massive restructuring of political, religious, economic, social, educational, cultural, scientific, and other realms of life; so too will the advance of digital technology occasion analogous transformations in the corresponding universe of present and future human activity.
This site will concern itself primarily with how K-20 education in the US, and the people who comprise its constituencies, may be affected by this transformative movement from one era to the next. All ideas considered here appear, to me at least, to impact the learning enterprise in some way. Accordingly, this work looks at the present and the future through a lens that is predominantly, but far from entirely, a digital one. -JL
Opinions expressed, scooped, or copied in this Scoop.it topic are my own, or a result of my own judgment, and should in no way be understood to reflect those of my employer.
The modern West has placed a high premium on the value of equality. Equal rights are enshrined in law while old hierarchies of nobility and social class have been challenged, if not completely dismantled. Few would doubt that global society is all the better for these changes. But hierarchies have not disappeared. Society is still stratified according to wealth and status in myriad ways.
On the other hand, the idea of a purely egalitarian world in which there are no hierarchies at all would appear to be both unrealistic and unattractive. Nobody, on reflection, would want to eliminate all hierarchies, for we all benefit from the recognition that some people are more qualified than others to perform certain roles in society. We prefer to be treated by senior surgeons not medical students, get financial advice from professionals not interns. Good and permissible hierarchies are everywhere around us.
Yet hierarchy is an unfashionable thing to defend or praise. British government ministers denounce experts as out of tune with popular feeling; both Donald Trump and Bernie Sanders built platforms on attacking Washington elites; economists are blamed for not predicting the 2008 crash; and even the best-established practices of medical experts, such as childhood vaccination, are met with resistance and disbelief. We live in a time when no distinction is drawn between justified and useful hierarchies on the one hand and self-interested, exploitative elites on the other.
We are in a time when continuous learning is becoming the accepted norm in life. We no longer get some schooling, find a job, and then work there until retirement.
This week I am in Florence, having spent two days at “The Future of Education” conference. Visiting this city, which has played such a significant role in western history, is inspiring. It encourages one not only to look back at what was, but also to look ahead at what might be…
The American system of higher education appears poised for disruptive change of potentially historic proportions due to massive new political, economic, and educational forces that threaten to undermine its business model, governmental support, and operating mission. These forces include dramatic new types of economic competition, difficulties in growing revenue streams as we had in the past, relative declines in philanthropic and government support, actual and likely future political attacks on universities, and some outdated methods of teaching and learning that have been unchanged for hundreds of years.
Most importantly, technological advances, the Internet, quantitative social science (recently known to the general public as “Big Data”), and the computer revolution have massively reinvented or disrupted travel, music, commerce, sports, newspapers, publishing, and many other information-based businesses. Is higher education next? Remember Newsweek? It was also in the business of creating and distributing knowledge. In 2010, the entire company was sold for $1.00 (Clark 2010; Vega and Peters 2010).
Over the last 10 years, the number of family offices around the world has increased significantly, and this trend is expected to continue. Family …
The 500 million-year-old fossils were discovered at Emu Bay Shale on South Australia's Kangaroo Island by researchers from the University of Adelaide and the South Australian Museum.
Instead of going to college to get a job, students will increasingly be going to a job to get a college degree.
What does this mean exactly? Today, the #1 reason why Americans value and pursue higher education is “to get a good job.” The path has always been assumed as linear: first, go to college and then, get a good job. But what if there was a path to get a good job first – a job that comes with a college degree? In the near future, a substantial number of students (including many of the most talented) will go straight to work for employers that offer a good job along with a college degree and ultimately a path to a great career.
Many of us are doing real work for only five or six hours each day while spending eight-plus hours “at work”, without realising how much time we lose that could be spent with family, on hobbies, exercise or energising side projects.
Instead of getting a full-day’s work accomplished, we’re losing productive time to interruptions from co-workers, needless meetings, and especially our perpetually pinging phones. The result is frustration, depression and lost opportunity for workers, and billions lost in profits for our employers.
According to an incisive report issued by online learning platform Udemy, the workplace is filled with distracted, often frustrated employees who aren’t achieving their potential, are depressed about it, and are unsure what to do.
In today’s rapidly changing market, organizations and workforce experts alike are trying to determine what the future of work will look like. However, most professionals involved in the process operate from the perspective of their individual scope of responsibility, which often leads to siloed perspectives that may solve one aspect of the challenge—yet create another problem.
Based on conversations across the market and our in-depth research, KellyOCG has determined that there are four dimensions organizations need to consider when contemplating the “future of work”: the workforce, the workplace, technology, and social norms. By taking these four dimensions into account, organizations can better gain a comprehensive overview of the range of models they may want to utilize to engage and execute within their workforce plan. Further, they can assess which combination of engagement models will drive the business outcomes they want to achieve.
Entitled “The Ego Revolution at Work”, the new book by Denis Pennel emphasises the demise of a dominant model of work organisation (Fordism) and the emergence of new forms of work (such as uberisation, the human cloud, and self-employment).
Today’s labour markets are characterised by the rise of a dispersed workforce and increasing working time flexibility. In this new environment the needs of individuals and business are changing rapidly: companies can no longer offer the security of “a job for life” and individuals want more freedom of choice and expect to work the way they live!
The way businesses and individuals think about employment needs to change to accommodate this new working environment. The workplace must be aligned with today’s diverse workforce. Social benefits must become portable to protect individuals’ rights whatever their status, and with these benefits no longer attached to an organisation. Employers must find new ways to attract and retain just-in-time workers and to engage with an extended workforce.
One of the great paradoxes of human endeavour is why so much time and effort is spent on creating things and indulging in behaviour with no obvious survival value – behaviour otherwise known as art.
Attempting to shed light on this issue is problematic because first we must define precisely what art is. We can start by looking at how art, or the arts, were practised by early humans during the Upper Palaeolithic period, 40,000 to 12,000 years ago, and immediately thereafter.
This period is a far longer stretch of human history than the “modern” age and so how the arts were practised during it should serve as the starting point for any viable explanation. And while art in the modern world is often exploited as a means of expressing individualism, during most of cultural evolution it was utilised by small hunter-gatherer groups as a means of articulating social norms among most, if not all, members of a community.

The arts are special
Why should individuals engage in a preoccupation that requires significant effort, effort that could be better directed towards more immediately gainful activities, such as the search for food or other vital resources? One clue comes from the fact that art objects have special resonance because they come into being through human agency. This involves considerable emotional investment and, consequently, art acts as a crucial node in the complex web of things that make up a culture.
The time and effort committed to making art suggests such behaviour may be a means of signalling to other members of a group. Paradoxically, the very fact that art remains inscrutable and has little obvious practical value is precisely what makes it important for assessing whether a person making art can be regarded as a trustworthy member of a group. In short, art provided a “costly signal” (altruistic behaviour that indirectly benefits the individual by establishing a reputation) for monitoring group allegiance and managing a trust network that weeded out freeloaders. When combined with ritual, which is often the case, art becomes an even more potent symbol. The notion that it can act as a vehicle for costly signalling is bolstered by the fact that art objects were regularly destroyed or defaced soon after being produced. This suggests that it was the process of making, rather than the final product, that was most significant.