Analogies of the human brain as a sort of organic computer (or vice versa, in the case of artificial intelligence) abound. But it remains a matter of debate in some circles whether a truly thinking machine is feasible, and if so, whether it would operate along the same lines as our biological brains.
Love them or hate them, selfies have become something of an icon of the 21st century. Though often dismissed as the ultimate expression of narcissism and irreverence, especially among the already much-criticized Millennial generation most likely to take them, selfies may in fact reflect something much deeper and more fascinating about the state of humanity.
I know, it might be hard to believe given how vacuous selfies seem, but Nicholas Mirzoeff of The Guardian makes a pretty compelling case about the sociological and cultural impact of selfies and digital media in general.
As the Information Age continues to yield exponentially more powerful computers and processors, the idea of artificial intelligence will become increasingly relevant and serious in the coming decades.
But some thinkers saw this coming well in advance, namely American mathematician and inventor Marvin Minsky, who passed away earlier this week at the age of 88.
As the Christian Science Monitor reports, this otherwise obscure figure (outside of the scientific and academic community) was a major intellectual contributor to A.I., laying its conceptual, practical, and ethical groundwork.
Since I find myself (fortunately) busy with some much-needed freelance work, I have decided to keep things a bit light today; if you are similarly fascinated by humanity’s boundless capacity for innovation and grandiosity, check out Popular Mechanics’ fascinating list of some of the world’s largest and most technically challenging projects under construction.
From near-stratospheric skyscrapers to valley-spanning bridges and even whole cities, these infrastructural marvels reflect the latest developments in both technology and human vision — to say nothing of the endless appetite for economic growth and global prestige alike.
It is very telling that most of these projects take place in the developing world, particularly China, though quite a few are being undertaken in the industrialized world, including the United States. A more cynical and cautious observer might worry about the environmental impact of these endeavors, or whether they are a good use of funds in light of the global economic slowdown; such concerns are well founded, though for now I am content to see what technological feats our species is capable of, and how the fruits of such projects — if any — will bear out in the coming years.
With the successful launch of Astrosat, a cutting-edge space observatory, the Indian Space Research Organisation (ISRO) has put India among a select group of countries that have independently designed and operated a space telescope to study celestial objects. As The Hindu reports:
The ability to simultaneously study a wide variety of wavelengths — visible light, ultraviolet and X-ray (both low- and high-energy) bands — has tremendous implications for scientists globally, particularly those in India. Though stars and galaxies emit in multiple wavebands, currently operational satellites have instruments that are capable of observing only a narrow range of wavelength band. Since the Earth’s atmosphere does not allow X-rays and ultraviolet energy from distant cosmic objects to reach ground-based telescopes, space observatories become important to unravel celestial mysteries. With Astrosat, Indian researchers will no longer have to rely on other space agencies for X-ray data, and scientists everywhere need no longer source data from more than one satellite to get a fuller picture of celestial processes. As in the case of Chandrayaan-1 and the Mars Orbiter Mission, Mangalyaan, the Astrosat telescope will have no immediate commercial or societal implications. But the instruments have been carefully chosen to allow scientists to pursue cutting-edge research. Chandrayaan-1 and Mangalyaan returned invaluable information, although they were launched several years after other countries sent satellites to the Moon and Mars. Given the uniqueness of Astrosat, it will enable Indian researchers to work in the frontier areas of high-energy astrophysics.
Moreover, most of the payloads in the satellite come not from ISRO, but from a range of scientific institutions across India and the world (including Indonesia, Canada, and the United States). Astrosat thus reflects the country’s wide breadth of native talent, as well as its capacity to combine and coordinate these vital skills into one platform that can benefit researchers everywhere.
Originality is overrated. Yes, novel ideas have often accounted for tremendous advancements in human knowledge and living conditions; but as Kat McGowan of Aeon writes, the ability to copy one another, and make incremental improvements along the way, has been much more consequential.
The history of technology shows that advances happen largely through tinkering, when somebody recreates a good thing with a minor upgrade that makes it slightly better. These humble improvements accrue over generations, so that the Bronze Age straight pin becomes a toga fastener becomes a safety pin. Money begins as seashells, evolves into metal coins, diversifies as paper, and eventually becomes virtual as bitcoins and abstruse financial derivatives. In this way, technologies arise that no one person could possibly invent on his own. When Isaac Newton talked about standing on the shoulders of giants, he should have said that we are dwarves, standing atop a vast heap of dwarves.
Researchers dub this iterative process ‘cumulative cultural evolution’: just as organisms evolve via repeated small changes in genes that provide a survival advantage, each human generation makes small modifications to the technology and traditions it inherits. This idea is most clearly articulated by the anthropologist Robert Boyd, of the Santa Fe Institute and Arizona State University, and the biologist and mathematical modeller Peter Richerson, of the University of California Davis. ‘When lots of imitation is mixed with a little bit of individual learning, populations can adapt in ways that outreach the abilities of any individual genius,’ they write in their book Not By Genes Alone (2005).
Lots of copying means that many minds get their chance at the problem; imitation ‘makes the contents of brains available to everyone’, writes the developmental psychologist Michael Tomasello in The Cultural Origins of Human Cognition (1999). Tomasello, who is co-director of the Max Planck Institute for Evolutionary Anthropology in Leipzig, calls the combination of imitation and innovation the ‘cultural ratchet’. It is like a mechanical ratchet that permits motion in only one direction – such as winding a watch, or walking through a turnstile. Good ideas push the ratchet forward one notch. Faithful imitation keeps the ratchet from slipping backward, protecting ideas from being forgotten or lost and keeping knowledge alive for the next round of improvement.
It turns out that creating something new is the easy part. What’s difficult – and what’s really important – is maintaining what we already know through copying. Luckily, we are very good at it.
In essence, human achievement at both the micro and macro levels has been the result of multiple parties, often spanning generations and cultures, having their go at an existing idea, invention, or concept. Progress is less about coming up with something immediately unique and earth-shattering, and more about looking around at what we already know and how best to improve upon it.
Aside from giving clever and well-meaning imitators their due credit, the lesson here is that progress is a collective and collaborative effort, involving lots of contributors willing to do the humble and thankless work of tweaking what we already have, so that over time, with the help of other tinkerers, the world reaps the benefits.
This might be too much of a romantic take on what many would consider mere copying, but I think it reflects the inherent pragmatism of the human species: whether in art, science, or philosophy, go with what already seems to work and see where that gets you. Give it time, and who knows where that will get us.
As Cracked writer Mark Hill observes in his brilliant piece, Five Things You Learn When A Facebook Friend Dies, “We’re the first era of humanity that has had to deal with death and the Internet, and grief for the passing of someone you only knew online.”
I lost a good Facebook friend some weeks ago, and it was not my first such experience. This article is on point, and I highly recommend you read it. As social media and the Internet as a whole become integrated into our everyday and emotional lives, the issues and feelings described in this piece will become increasingly common. It is important to reflect upon the implications of connecting with someone so distant in some respects, yet so close in many others.
As always, feel free to share your own thoughts and reactions to the article or the topic as a whole. I will have to dedicate another post with my own reflections on the subject later.
As the Internet rapidly becomes a global phenomenon — already accessed by about half of the world’s population — it is worth looking at which websites have developed a foothold in certain countries, particularly as more and more people continue to join the truly worldwide web.
As one can plainly see, American Internet giants — namely Google and Facebook — dominate the global market. Yahoo! has surprisingly managed to maintain a hold in Japan, Taiwan, and (of all places) the African nation of Gabon.
This dominance is due partly to the competitive and technological edge of U.S. tech companies, and partly to the prevalence of the English language (particularly among the well-off and educated people more likely to have Internet access; hence the relative scarcity of Hindi or Swahili on the Web).
Advances in technology, ranging from the 18th-century cotton gin to the latest cutting-edge robots, have long been cited as leading factors in the decline of both employment and quality of work. But a recent study from Deloitte, a major consultancy based in the U.K., has challenged this common narrative, arguing that on the contrary, technological innovations have created far more jobs, and far better lives, than they are given credit for.
From The Guardian:
Their conclusion is unremittingly cheerful: rather than destroying jobs, technology has been a “great job-creating machine”. Findings by Deloitte such as a fourfold rise in bar staff since the 1950s or a surge in the number of hairdressers this century suggest to the authors that technology has increased spending power, therefore creating new demand and new jobs.
Their study, shortlisted for the Society of Business Economists’ Rybczynski prize, argues that the debate has been skewed towards the job-destroying effects of technological change, which are more easily observed than its creative aspects.
Going back over past jobs figures paints a more balanced picture, say authors Ian Stewart, Debapratim De and Alex Cole.
“The dominant trend is of contracting employment in agriculture and manufacturing being more than offset by rapid growth in the caring, creative, technology and business services sectors”, they write.
“Machines will take on more repetitive and laborious tasks, but seem no closer to eliminating the need for human labour than at any time in the last 150 years”.
Citing a century and a half of historical data from the U.K., the researchers found a precipitous shift away from “hard, dull, and dangerous” work, such as agriculture and clothes washing, toward less physically intensive jobs focused on “care, education and provision of services to others”.
Telepathy promises an intimate connection to other human beings. If isolation, cruelty, malice, violence and wars are fuelled by misunderstandings and communication failures, as many people believe, telepathy would seem to offer the cure.
But findings from affective neuroscience, social psychology and the new neuroscientific study of empathy suggest that tapping directly into other people’s thoughts would be a pretty bad idea. In the past decade or so, this research has revealed that we already have deep insights into what other people feel and think. We really do have a sixth sense, but it’s psychological rather than psychic, made up of an entirely natural and completely human blend of emotional intuition and clever reasoning.
The more we know about empathy and ordinary human mind-reading, the less it looks like a way to achieve world peace. Technologically assisted telepathy could exaggerate flaws in our moral thinking and saddle us with unbearable intimacy, encouraging us to tune out the suffering of the most vulnerable. Emotional-mindreading is no guarantee of kindness; it is also how psychopaths and bullies manipulate and torment their victims. This research suggests an entirely sensible, completely ordinary, not-at-all-clairvoyant prediction about the future: rather than a dreamy bliss of togetherness, artificial telepathy would be a nightmare.
— Kat McGowan, “Can we harness telepathy for moral good?”
Read the rest of the Aeon article hyperlinked above, and share your own thoughts on the subject. What do you think about the merits of telepathic technology?