India’s OA policy: Learning from Ioannidis

India’s first Open Access policy was drafted by a committee affiliated with the Departments of Biotechnology and Science & Technology (DBT/DST) in early 2014. It hasn’t been implemented yet. The first draft accepted comments on its form and function on the DBT website until July 25; the second draft was released last week and is open for comments until November 17, 2014. If it comes into effect, it could greatly expand the prevalence of healthy research practices in the Indian scientific community, at a time when the sheer scale and complexity of research elsewhere in the world makes such practices hard to mandate.

The policy aspires to set up a national Open Access repository, akin to PubMed Central for the biomedical sciences and arXiv for the physical sciences in the West, that will maintain copies of all research funded in part or in full by DBT/DST grants. In the spirit of Open Access publishing, its contents will be fully accessible free of charge.

According to the policy, a scientist applying for a grant must provide proof that papers from previously funded research have been uploaded to the repository, with the respective grant IDs mentioned in the uploads. The policy also requires institutions to set up their own repositories, and asks that the contents of all institutional repositories be interoperable.

The benefits of such a system are many. It would address a host of problems that are becoming ever more intricately interconnected, giving rise to a veritable Gordian knot of stakeholder dynamics. India’s relatively small research community can still avoid this knot by implementing a few measures, this policy among them.

For one, calls for restructuring the Indian academic hierarchy have already been made. Here, even university faculty appointments are not transparent. The promotion of scientists with mediocre research outputs to top administrative positions sidelines better leaders who’ve gone unnoticed, and their protracted tenure at the helm often stifles new initiatives. As a result, much of scientific research has become the handmaiden of defence research, if not of profitability. In the biomedical sector, for example, stakeholders want reproducible results in order to identify profitable drug targets but become loath to share data from subsequent stages of the product development cycle because of their investments.

There is also a bottleneck between laboratory prototyping and mass production in the physical sciences because private-sector participation has been held at bay by concordats between Indian ministries. A DST report from 2013 concedes that the government hopes to achieve a 50-50 split between public and private investment only by 2017, while the global norm is already 66-34 in favour of the private sector.

These concerns have been raised repeatedly by John Ioannidis, the epidemiologist whose landmark 2005 paper on the unreliability of most published medical findings set off a wave of concern about the efficiency of scientific research worldwide. It criticized scientists for favouring positive, impactful results – even where none could exist – in order to secure funding. In doing so, they skewed the medical literature to paint a more revolutionary picture than prevails in real life, wasting an estimated 85% of research resources in the process.

Ioannidis’s paper was provocative not because it proclaimed the uselessness of a lot of medical results but because it exposed the various mechanisms through which researchers could persuade the scientific method to yield more favourable ones.

A ‘sequel’ paper of his was published on October 19, on the 10th anniversary of the Open Access journal PLOS Medicine. In it, he goes beyond specific problems – small sample sizes, reliance on outdated statistical measures, flexibility in research design, and so on – to show what disorganized research can do to undermine itself. The narrative will help scientists and administrators alike design more efficient research methods, and so help catalyse the broad-scale adoption of practices that have until now been viewed as desirable only for this or that research area. For India, implementing its Open Access policy could be the first step in this direction.

Making published results – those funded in part or fully by DBT/DST grants – freely accessible is known to engender practices like post-publication peer-review and data-sharing. Peer-review is the process of getting a paper vetted by a group of experts before publication in a journal. Doing it post-publication invites constructive criticism from a wider group of researchers and exposes the experimental procedures and statistical analyses to broader scrutiny. This in turn inculcates a culture of replication – where researchers repeat others’ experiments to see if they reach the same conclusions – that reduces the prevalence of bias and makes scientific research as a whole more efficient.

Furthermore, requiring multiple institutional repositories to be interoperable will spur the development of standardised definitions and data-sharing protocols. It will also lend itself to effective data-mining for purposes of scientometrics and science communication. In fact, the text and metadata harvester described in the policy is already operational.
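
The policy doesn’t specify which protocol the harvester and the repositories will use to talk to each other, but OAI-PMH is the de facto standard for this kind of repository interoperability. Below is a minimal sketch, under that assumption, of what harvesting metadata from one compliant repository could look like – the endpoint URL is hypothetical, and a real harvester would also have to page through results and deduplicate records.

```python
# Minimal sketch of harvesting Dublin Core metadata over OAI-PMH,
# the standard protocol for interoperable institutional repositories.
# The endpoint used below is hypothetical.
import requests
import xml.etree.ElementTree as ET

OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
DC_NS = "{http://purl.org/dc/elements/1.1/}"

def harvest_titles(base_url: str) -> list[str]:
    """Fetch one page of records and return the titles they carry."""
    response = requests.get(
        base_url,
        params={"verb": "ListRecords", "metadataPrefix": "oai_dc"},
        timeout=30,
    )
    response.raise_for_status()
    root = ET.fromstring(response.content)
    titles = []
    for record in root.iter(f"{OAI_NS}record"):
        title = record.find(f".//{DC_NS}title")
        if title is not None and title.text:
            titles.append(title.text.strip())
    return titles

if __name__ == "__main__":
    # Hypothetical institutional repository endpoint
    for t in harvest_titles("https://xyz.sciencecentral.in/oai"):
        print(t)
```

Because every compliant repository answers the same query in the same format, a single national harvester can sweep all institutional repositories without custom integrations – which is exactly what interoperability buys.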

Registration of experiments – the practice of formally notifying an authority that you’re going to perform an experiment – is also a happy side-effect of a national Open Access repository because it makes the use of public funds easier to track, something Ioannidis emphasizes. By declaring their sources of funding, scientists automatically register their experiments. This could bring as-yet invisible null and negative results to the surface.

A Stanford University research team reported in August 2014 that almost 67% of experiments (funded by the National Science Foundation, USA) that yielded null results never see the light of day, and only 21% of those sent to journals are published. In contrast, 96% of studies with strong, positive results are written up and 62% are published. Without prior registration of experiments, then, the record of how public funds are used for research can be distorted – a detriment to a country that actually requires more oversight.

It would of course be foolish to assume one policy can be a panacea. Ioannidis’s proposed interventions cover a range of problems in research practices, and they are all difficult to implement at once – even though they ought to be. But it would be more foolish to hold a part of the solution – one capable of reforming the evaluation system in ways that benefit the credibility of scientific research – and delay its implementation. Even if the Open Access policy can’t address institutional nepotism or the hypocrisy of data-sharing in biomedical research, it provides an integrated mechanism to deal with the rest. It helps adopt common definitions and standards, promotes data-sharing and creates incentives for it, and emphasizes the delivery of reproducible results.


Second draft of India’s OA policy open for comments

The second draft of India’s first Open Access policy is up on the Department of Biotechnology (DBT) website. Until November 17, 2014, DBT Adviser Mr. Madhan Mohan will receive comments on the policy’s form and function, after which a course for implementation will be charted. The Bangalore-based Centre for Internet and Society (CIS), a non-profit research unit, announced the update on its website while also highlighting some instructive differences between the first and second drafts of the policy.

The updated policy makes it clear that it isn’t concerned with tackling the academic community’s prevalent yet questionable reliance on quantitative metrics like impact factors for evaluating scientists’ performance. Prof. Subbiah Arunachalam, a member of the committee that drafted the policy, had said as much to this blogger in August this year.

The draft also says the policy will not “underwrite article-processing charges” that some publishers levy to make articles available Open Access. Elsevier, which publishes 25 journals in India, has asked for a clarification on this.

Adhering to the policy’s mandates means scientists who have published a paper made possible by DBT or DST funding should deposit that paper in an Open Access repository maintained either by the government or by the institution they’re affiliated with.

They must do so within two weeks of the paper being accepted for publication. If the publisher has instituted an embargo period, then the paper will be made available on the repository after the embargo lifts. CIS, which advised the committee, has recommended that this period not exceed one year.

As of now, according to the draft, “Papers resulting from funds received from the fiscal year 2012-13 onwards are required to be deposited.” A footnote in the draft says that papers under embargo can still be viewed by individuals if the papers’ authors permit it.

The DBT repository is available here, and the DST repository here. All institutional repositories will be available as sub-domains on sciencecentral.in (e.g., xyz.sciencecentral.in), while the domain itself will lead to the text and metadata harvester.

The drafting committee also intends to inculcate a healthier Open Access culture in the country. It writes in the draft that “Every year each DBT and DST institute will celebrate “Open Access Day” during the International Open Access Week by organizing sensitizing lectures, programmes, workshops and taking new OA initiatives.”


‘When you change something in a virus, you lose something else’

The contents of this blog post should have come out earlier (in a different form), but better late than never, eh? The Ebola outbreak has been threatening more than ever to go out of control (even if it’s doubtful we’re really in control now). As doctors and healthcare workers grappled with containment in West Africa, Michael Osterholm, the director of the Center for Infectious Disease Research and Policy at the University of Minnesota, wrote an alarmist opinion piece in The New York Times on September 11 that was more panic-mongering than a call to action. The thrust of Osterholm’s argument was:

The second possibility is one that virologists are loath to discuss openly but are definitely considering in private: that an Ebola virus could mutate to become transmissible through the air. … If certain mutations occurred, it would mean that just breathing would put one at risk of contracting Ebola. Infections could spread quickly to every part of the globe, as the H1N1 influenza virus did in 2009, after its birth in Mexico.

Sometime soon after, I spoke to a virologist at Columbia University, Dr. Vincent Racaniello, about Osterholm’s statements. I picked out Dr. Racaniello after stumbling on his virology blog (bookmark it, it’s very insightful) which at the time appeared to be one of the few voices of reason advocating caution in the face of the outbreak and pushing against the notion of an airborne Ebola virus with some crucial facts. Below, I reproduce parts of our conversation that address the nature of such facts and how they should guide us.

Note: For the TL;DR version, scroll right to the bottom.

What we know about Ebola based on what we’ve learnt from studying viruses

Some viruses are studied more than others because of their impact on human health. HIV, influenza, the herpes viruses… Herpes viruses infect almost every person on the Earth; influenza infects hundreds of thousands every year; HIV has infected millions and millions of people – so those get most of the attention and people work on them a lot. Some of the things you find may be generalizable, such as the general need of a virus to get inside a cell and replicate its genome. But each virus has specifics. Each is very different: the genome is different, the way the genome is encased is different, the way it gets into cells is different, and the ways they spread from person to person are often very different.

For example, if you study transmission of the influenza virus in an animal model, you may learn what controls the transmission of those viruses through the air, but you can’t assume that’s going to be the same for the Ebola virus. So people make the mistake of saying “Because this virus does this, then that virus must do the same thing”. That’s not correct. Unfortunately, it makes it complicated because every virus needs to be studied on its own. We can’t study influenza and hope to prevent Ebola.

How viruses evolve to become deadlier

From what we have seen, if you gain a function, you typically lose something else. When humans impose genetic changes on viruses, they’re doing so from their point of view as opposed to the way it happens in nature, where evolution does the job. When a virus in nature somehow evolves and becomes transmissible in some species, it’s because the virus with the right genome has been selected as opposed to in the lab where a human puts one or two mutations in a virus and gets a phenotype. We don’t know how to achieve gain-of-function in viruses in the lab. We have a lot of hubris, we think we can do anything with viruses. We introduce an amino acid change but who knows what it’s doing to the virus.

What we’ve observed over the years is that when you introduce changes in the virus in the laboratory to get a new property that you want, you lose something else. In terms of transmission, there haven’t been that many transmission experiments done with viruses to understand what controls it. H5N1 – avian influenza – in ferrets is really the only one, and there, the gain of aerosol transmission caused a loss of virulence. That’s probably because you need other changes to compensate for what you’ve done, but we’re only looking at transmission.

In nature, perhaps that would be taken care of, so that’s why I say when you change something in a virus you lose something else. But this is not to say that this is always going to be the case. You can’t predict in viruses – you can’t predict in science, often – what’s going to happen. But what we can do is use what we know and use that to inform our thinking. For example, in nature, influenza viruses are very nicely transmitted, but they’re not all that virulent. They don’t have a 90% case-fatality ratio like Ebola, so I think there’s something there that tells us that aerosol transmission is a difficult thing to achieve. But we don’t know what will happen.

An Ebola virus virion. Image: CDC/Wikimedia Commons

About what other evolutionary pathways Ebola has at its disposal

Viruses can be transmitted in a number of ways. They can be transmitted through the air, they can be transmitted by close contact of various sorts, they can be transmitted by body fluids, they can be transmitted by sexual contact, intravenous drug use, mother to child during birth, they can be transmitted by insect vectors, and of course some can be transmitted in our DNA – 8% of our genome is a virus. We have never seen a human virus change the way it’s transmitted. Once a virus has already been in people, we have never seen it change.

We’ve been studying viruses for just over 100 years, which is admittedly not a long time – viruses have probably been around since the beginning of the Earth, billions of years – but we go based on what we know, and we’ve never seen a virus change its mode of transmission. I’m not particularly worried about Ebola changing its routes of transmission. Right now, it’s spreading by close contact from person to person via body fluids and I think it’s going to stay that way. I don’t think we need to worry about it being picked up by a mosquito, for example – that’s very difficult to do because then the virus would have to replicate in the mosquito, and that’s a big challenge. And who knows, if it acquired that, what other property would be compromised.

What, according to Dr. Racaniello, we need to focus on

I think we need to really bear down on stopping transmission. It can be done, it’s not going to be easy, but it’s going to require other countries helping out because these West African countries can’t do it themselves. They don’t have a lot of resources and they’re losing a lot of their healthcare people from the epidemic itself. I don’t see what worrying about aerosol transmission would do. I don’t see it changing the way we treat the outbreak at all. I think right now we need to get vaccines and antivirals approved, so that we can get in there and use them. In the meantime, we need to try and interrupt transmission. In past outbreaks, interrupting it has been the way to stop the outbreaks. Admittedly, they’ve been a lot smaller, easier to contain. But SARS infected 10,000 people globally and it was contained by very stringent measures. That was a virus that did transmit by aerosol. So it can be done – it’s just a matter of getting everyone cooperating to do it.

If a virus can become more transmissible after infecting a human population

If you saw the movie ‘Contagion’ – in this movie, the virus mutated and increased its reproductive index, which I thought was one of the weaknesses of the movie. We’ve never seen that happen in nature, which is not to say that it hasn’t. When a virus starts circulating in people, it has everything it needs to circulate effectively. Often, people will bring up the 1918 influenza virus, which seemed to get more virulent as the outbreak continued, but back then we hadn’t even isolated the influenza virus. It wasn’t isolated until 1933. So there’s just no way we can make definitive statements about what did or didn’t happen, but people speculate all the time.

I wish we could go back in time and sample all the viruses that have been out there, but we’re going to have to see it happen. For the same reason, no virus has ever changed its transmission route in people. If it had, we could have taken the virus before and after the change, sequenced it and said, “Aha! This is what’s important for this kind of transmission!” We don’t have that information, so we depend on animals for this.

TL;DR:

  • We can’t study influenza and hope to prevent Ebola.
  • When you introduce changes in the virus in the laboratory to get a new property that you want, you lose something else.
  • In nature, influenza viruses are very nicely transmitted, but they’re not all that virulent. I think there’s something there that tells us that aerosol transmission is a difficult thing to achieve.
  • No virus has ever changed its transmission route in people.
  • SARS infected 10,000 people globally and it was contained by very stringent measures. That was a virus that did transmit by aerosol. So it can be done – it’s just a matter of getting everyone cooperating to do it.

Europa’s ice shell could be quaking

Even before astronomers noticed last year that Europa was spouting jets of water vapor from its icy surface, they thought there was something shifty about Jupiter’s moon. While the 66 other Jovian moons are pitted with craters, Europa sports some unusual blemishes: an abundant crisscrossing of ridges tens of kilometres long. Many are abruptly interrupted by smooth ice patches.

Two geologists think they can explain why. Backed by photos taken by the Galileo space probe, they suggest Europa’s thick icy shell isn’t continuous but is made up of distinct plates of ice. These plates move away from each other in some places, exposing gaps that are then filled by deeper ice rising upward. In other places they slide over one another, pushing surface ice downward and forming ridges.

“We knew that stuff has been moving over the surface, and up from beneath and breaking through, but we weren’t able to figure where all the older stuff was going,” said study coauthor Dr. Louise Prockter, a planetary scientist at Johns Hopkins. “We’ve found for the first time evidence that material is going back into the interior.” The study was published last month in Nature Geoscience.

On Earth, this kind of tectonic activity replenishes compounds necessary for life, such as carbon dioxide, by letting them move up from the interior through fissures to the surface. Now, scientists say a similar mechanism could apply to Europa. Astronomers think the moon harbors a subsurface ocean of liquid water that feeds the vapor plumes, and could be habitable.

“It’s certainly significant to find another solid body in the solar system that undergoes some kind of surface recycling,” said Peter Driscoll, a planetary scientist at the University of Washington who was not involved in the study.

Dr. Prockter, together with Simon Kattenhorn, a geologist at the University of Idaho, Moscow, worked with photographs of a part of Europa’s surface covering 20,000 km². The pictures were shot by Galileo when it orbited Jupiter from 1995 to 2003.

“We go in using something like Photoshop and start cutting the image up,” Dr. Prockter explained. They then pieced them back together so that the crisscrossing ridges lined up end-to-end, and compared what they had to the surface as it is today.

“Once we started doing the reconstruction, we ended up with a big gap right in the middle,” she said.

The researchers concluded the missing bit had dived beneath another plate.

Although only some of Galileo’s photographs were at a resolution high enough to be useful for the study, Dr. Prockter said it was unlikely that their finding was a one-off because signs of displacement were visible all over Europa’s surface.

Nevertheless, Dr. Driscoll cautioned against using Earth’s tectonic activity as a model for Europa’s. “There are a number of missing features” that define tectonics on Earth, he said, such as arc volcanoes and continents. “And many of the properties of Earth’s features may not be expected for an icy shell like Europa, where the materials are extremely different.”

A better gauge of these disparities might be a probe to the Jovian moon that NASA has planned for the mid-2020s.

“I think the timing right now is very important,” said Candice Hansen, a member of NASA’s Planetary Science Subcommittee. She says the Europa study will help scientists working on the probe secure the requisite funding and commitment from Congress.

“I am very enthusiastic about a mission to Europa, and this exciting result is one more reason to go,” she said.

Artist’s concept of the Europa Clipper mission investigating Jupiter’s icy moon Europa. Image credit: NASA/JPL-Caltech


Why you should care about the mass of the top quark

In a paper published in Physical Review Letters on July 17, 2014, a team of American researchers reported the most precisely measured value yet of the mass of the top quark, the heaviest fundamental particle. Its mass is so high that it can exist only in very high-energy environments – such as inside powerful particle colliders or in the very early universe – and not anywhere else.

Because of this, the American team’s efforts to measure its mass might come across as needlessly painstaking. However, there’s an important reason to get as close to the exact value as possible.

That reason is possibly 2012’s most famous discovery. It was drinks all round for the particle physics community when the Higgs boson was discovered by the ATLAS and CMS experiments at the Large Hadron Collider (LHC). While the elation lasted a while, serious questions were already being asked about some of the boson’s properties. For one, it was much lighter than anticipated by some promising areas of theoretical particle physics. Proponents of an idea called naturalness pegged it to be 19 orders of magnitude higher!

Because the Higgs boson is the particulate residue of an omnipresent energy field called the Higgs field, the boson’s mass has implications for how the universe should be. With the boson turning out to be so much lighter, physicists couldn’t explain why the universe wasn’t the size of a football – even though their calculations suggested it should be.

In the second week of September 2014, Stephen Hawking said the Higgs boson would cause the end of the universe as we know it. Because it was Hawking who said it, and because his statement contained the phrase “end of the universe”, the media hype was ridiculous yet to be expected. What he actually meant was that the ‘unnatural’ Higgs mass had placed the universe in a difficult position.

The universe would ideally love to be in its lowest energy state, like you do when you’ve just collapsed into a beanbag with beer, popcorn and Netflix. However, the mass of the Higgs has trapped it on a chair instead. While the universe would still like to be in the lower-energy beanbag, it’s reluctant to get up from the higher-energy yet still comfortable chair.

Someday, according to Hawking, the universe might gain enough energy to get out of the chair and then collapse into its lowest energy state – the beanbag. That day, though, is trillions of years away.

What does the mass of the top quark have to do with all this? Quite a bit, it turns out. Fundamental particles like the top quark possess their mass in the form of potential energy. They acquire this energy when they move through the Higgs field, which is spread throughout the universe. Some particles acquire more energy than others. How much energy is acquired depends on two parameters: the strength of the Higgs field (which is constant), and the particle’s Higgs charge.

The Higgs charge determines how strongly a particle engages with the Higgs field. It’s the highest for the top quark, which is why it’s also the heaviest fundamental particle. More relevant for our discussion, this unique connection between the top quark and the Higgs boson is also what makes the top quark an important focus of studies.
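
The post doesn’t spell out the underlying formula, but in the Standard Model the ‘Higgs charge’ described above is the particle’s Yukawa coupling, and the relation between that coupling and the particle’s mass is a one-liner. A rough sketch, with v ≈ 246 GeV standing in for the constant strength (vacuum expectation value) of the Higgs field:

\[
m_f = \frac{y_f\, v}{\sqrt{2}} \quad\Longrightarrow\quad y_t = \frac{\sqrt{2}\, m_t}{v} \approx \frac{1.414 \times 175\ \mathrm{GeV}}{246\ \mathrm{GeV}} \approx 1
\]

A coupling of order one – the largest of any known fundamental particle – is why the top quark is such a sensitive probe of the Higgs sector.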

Getting the mass of the top quark just right therefore helps pin down its Higgs charge, and so the extent of its coupling with the Higgs boson – and, in turn, the properties of the Higgs boson itself. Small deviations in the value of the top quark’s mass could spell drastic changes in when or how our universe will switch from the chair to the beanbag.

If it did, all our natural laws would change, and life would become impossible.

The American team used values obtained from the D0 experiment at the Tevatron particle collider at the Fermi National Accelerator Laboratory. The Tevatron was shut down in 2011, so these measurements are the collider’s last word on the top quark’s mass: 174.98 ± 0.76 GeV/c² (the Higgs boson weighs around 126 GeV/c²; a gold atom, considered pretty heavy, weighs about 183 GeV/c²). This is a precision of better than 0.5%, the finest yet. The value is likely to be updated once the LHC restarts early next year.
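
As a quick check of that precision figure, using the numbers quoted above:

\[
\frac{\Delta m_t}{m_t} = \frac{0.76\ \mathrm{GeV}/c^2}{174.98\ \mathrm{GeV}/c^2} \approx 0.0043 = 0.43\% < 0.5\%
\]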

Featured image: Screenshot from Inception

A standout technology prize

The Nobel Prize award ceremony, Stockholm, 2007. Image: nobelprize.org

Once a year, the Nobel Prize in physics triggers a burst of science news coverage in the media, giving some decades-old invention or discovery more than its 15 minutes’ due on a channel, paper or portal that might otherwise never have bothered with it. Despite its abundant quirks, the prize, the consequent celebration and the subsequent snubs do make for good news.

But this year’s prize may have been a little different. It was awarded to the inventors of the blue light-emitting diode (blue LED). LEDs that emit the two other primary colors, green and red, were easier to produce; the higher-frequency blue emitter proved to be the stumbling block until this year’s Japanese and American Laureates succeeded in the late 1980s. By combining the three colors, the white LED emerged and became the device to, as the Nobel Prize Committee is only too happy to proclaim, power the 21st century.

The reason it’s different is that it draws attention to an arguably understated engineering development. The Nobel Prize Committee has not had any perceptible bias toward or against engineering, specifically materials science: the 2000, 2001, 2003, 2007, 2009 and 2010 physics prizes, to pick from recent years, lauded accomplishments in engineering and materials science. However, unlike this year’s winning work, those accomplishments had become very popular and entered mainstream public consciousness by the time their significance was recognized with a Nobel Prize. In fact, that has been the profile of most prize-winning discoveries: scientifically significant as well as novelty heavyweights.

In contrast, this year’s prize was more for the achievement of synthesizing gallium nitride (GaN), the compound semiconductor at the heart of blue LEDs, whose success story hasn’t quite been one for the romantic science books. It could be that the blue LED – or LEDs for that matter – didn’t need romanticizing, that its pursuit has already been justified to the common man by giving him a cheap, “energy-saving” light bulb. It could be that the technology was so sought-after that it was only too successful in crossing the boundary between discovery and mass utilization.

Since it was first awarded in 1901, the Nobel Prize in physics has recognized 114 discoveries (75%) but only 39 inventions (25%, including blue LEDs)*. This century, however, has seen a higher incidence of inventors among the Laureates, as well as a tendency to recognize more and more recent inventions. Blue LEDs (2014) emerged in the late 1980s; Wineland’s and Haroche’s particle-manipulation techniques (2012) were developed in the late 1980s; graphene (2010) was first produced in 2003; the CCD sensor (2009) was invented in 1969; the frequency comb (2005) was perfected in the 1990s; Bose-Einstein condensates (2001) were first achieved in 1995; and so forth.

This may well be the physics prize committee’s way of acknowledging the dominance of technology. It could also be our window to understanding how the award-winning science of the previous century is shaping the award-winning technology of the last three decades.


*Determined based on Nobel Prize citations. For Laureates whose citations were ambiguous, such as “contributed to the development of”, etc., the nature of work was assumed to be both an invention as well as a discovery.

Ello! I love you, let me jump in your game!

This is a guest post contributed by Anuj Srivas. Formerly a tech reporter and writer for The Hindu, he’s now pursuing an MSc at the Oxford Internet Institute and blogging for Sciblogger.

If there were ever an artifact to which Marshall McLuhan’s ‘the medium is the message’ best applied, it would be Ello. The rapidly growing social network – much like the EU’s ‘right to be forgotten’ – is quickly turning out to be something of a Rorschach test: people look at it and see what they wish to see.

Like all political slogans, Ello’s manifesto is becoming an inkblot onto which we can project our innermost ideologies. It is instructive to look at the wide range of reactions, if only because they tell us something about the way in which we will build the future of the Web.

Optimists and advocates of privacy look at Ello and see the start of something new, or a chance to rework the targeted-advertising foundations of our Web. The most sceptical of this lot, however, point towards the company’s venture capital funding and sneer.

Technology and business analysts look at Ello and see a failed business model; one that is doomed from the start. Feminists and other minority activists look at the company’s founders and notice the appalling lack of diversity. Utopian Internet intellectuals like Clay Shirky see Ello as a way to reclaim conversational discourse on the Internet, even if it doesn’t quite achieve it just yet.

What do I see in the Ello inkblot? Two things.

The first is that Ello, if it gains enough traction, will become an example of whether the free market is capable of providing a social network alternative that respects privacy.

For the last decade, one of the biggest debates among netizens has been whether we should take steps (legal or otherwise) to safeguard values such as privacy on the Internet. One of the most vocal arguments against this has been that “if the demand for the privacy is so great, then the market will notice the demand and find some way to supply it”.

Ello is seemingly the first proper, privacy-respecting, centralized social network that the market has spat out (Diaspora was more of a social creation designed to radically change online social networks, which in all likelihood is what caused its stagnation). In this way, the VC funding gives Ello a greater chance to provide a better experience – even if it does prove to be the spark that leads to the company’s demise.

If Ello succeeds and continues to stick to its espoused principles, then that’s one argument settled.

The second point pertains to all that Ello does not represent. Sociologist Nathan Jurgenson has an excellent post on Ello in which he lashes out at how online social networks are still being built only by technology geeks. He writes:

This [Ello] is yet another example of social media built by coders and entrepreneurs, but no central role for those expert in thinking about and researching the social world. The people who have decided they should mediate our social interactions and write a political manifesto have no special expertise in the social or political.

I cannot emphasize this point enough. One of the more prominent theories regarding technology and its implications is the ‘social shaping of technology’. It holds that technology is not born and developed in a vacuum – it is very much shaped and created by relevant social groups. There is little doubt that much of today’s technology and online services are skewed very disproportionately: the number of social groups involved in the creation of an online social network is minuscule compared to the potential reach and influence of the final product. Ello is no different in this regard.

It is a combination of these two points that sums up the current, almost tragic state of affairs. The technology and digital tools of today are very rarely created, or deployed, with the needs of the citizen in mind. They are usually brought to life from some entrepreneur’s or venture capitalist’s PowerPoint presentation and then applied to real-world situations.

Is Ello the anti-Facebook that we need? Perhaps. Is it the one we deserve? Probably not.
