21 January 2014

FCC Won’t Protect Net Neutrality

ISPs may be able to offer faster connections to preferred online portals and even block their rivals, after a US appeals court ruled that regulators could no longer enforce so-called “net neutrality” rules.

Recently, the US Court of Appeals ruled in favor of Verizon. In the case, FCC regulators argued that forcing ISPs to provide equal access and bandwidth to all legitimate material is vital to the open Internet and encourages innovation. ISPs, however, believe that net neutrality hampers both their ability to strike commercial deals with content providers and their ability to offer higher-speed access to premium content – for example, high-definition movies. The broadband providers argue that deregulation could spur more growth in emerging markets.

Beyond the commercial impact, the court decision drew immediate concern from free speech groups that worry about the consequences of undermining net neutrality worldwide. They claim that the decision will adversely affect the daily lives of US citizens and change the open nature of the web.

Critics say the court’s ruling grants commercial entities the right to block traffic and give preferential treatment to websites they like, thus steering users toward or away from them. The FCC is going to appeal the decision, which may eventually wind up before the US Supreme Court.

One of the judges pointed out that regardless of the merits of net neutrality, the FCC simply had no legal power to treat ISPs like traditional telephone operators. Verizon, naturally, welcomed the ruling, insisting that it would extend consumer choice. The company assured users that the move won’t affect their ability to access and use the Internet, but will rather give Verizon more room for innovation.

However, net neutrality supporters argue that allowing powerful tech giants to strike deals with ISPs to promote their material over others may hamper the growth of social media. They point out that the web’s capacity to spread and share ideas keeps improving, so that individuals and small groups are now able to produce resources that used to be the exclusive domain of large entities – an openness such deals would put at risk.

NEVER LOST AGAIN WITH NEW DIGITAL OBJECT (DO) ARCHITECTURE

Robert E Kahn is considered one of the key Internet pioneers. An engineer and computer scientist, he, together with Vint Cerf, invented the Transmission Control Protocol (TCP) and the Internet Protocol (IP), the fundamental communication protocols at the heart of the Internet.
His latest project is the Digital Object (DO) Architecture. A key feature of the DO Architecture is the unique persistent identifier associated with each digital object. Imagine a large document or blog post with many embedded URLs. After a certain amount of time, most of those URLs will become non-operational. If you replace them with unique persistent digital object identifiers then, if properly administered, the links will never be lost – because each identifier is associated with a digital object rather than with a port on a machine. That’s only part of the story, though: the DO Architecture also provides security features that can, for example, better enable transactions and rights management. Libraries and the film industry are among the early adopters of this technology.
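To make the idea concrete, here is a minimal Python sketch of identifier resolution, using a hypothetical in-memory registry and a made-up identifier; a real deployment would resolve identifiers through the Handle System’s protocol, not a local dictionary.

    # Hypothetical registry: maps a persistent identifier to the object's
    # *current* location -- not to a fixed host and port.
    registry = {
        "20.5000/example-article-42": "https://archive.example.org/store/42",
    }

    def resolve(identifier: str) -> str:
        """Return the current location of the digital object."""
        return registry[identifier]

    # When the object moves, only the registry record changes;
    # every document that embeds the identifier keeps working.
    registry["20.5000/example-article-42"] = "https://new-host.example.net/objects/42"

    print(resolve("20.5000/example-article-42"))
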
ITU talked to Robert E Kahn about his work on DO Architecture and his motivation for bringing it to ITU.
With the DO Architecture, were you trying to address current challenges, facilitate new ways of doing things, or both?
In the late 1980s, my colleague Vint Cerf and I perceived the need to move beyond the rather static methods being used to manage information in the Internet. This led to an effort which we called Knowbot programming, or more generally, mobile programming. We wrote a report – The Digital Library Project, Vol. 1: The World of Knowbots (March 1988) – that describes the basic components of an open architecture for a digital library system and a plan for its development. Certain information management aspects of this effort, in particular the identifier/resolution component, were later developed to become the basis for the Digital Object (DO) Architecture.
ITU-T recently approved a global standard for the discovery of identity information (Recommendation ITU-T X.1255) that was based on CNRI’s contribution. What is Recommendation ITU-T X.1255 and why is it important?
With the proliferation of information systems across the Internet, and with the associated creativity and innovation, a critical question has arisen: “What are the basic building blocks available to the public that will enable interoperability across such heterogeneous systems?”
ITU-T X.1255 was based on CNRI’s DO Architecture and expanded by ITU-T Study Group 17, the ITU-T group leading security and identity management (IdM) standards work. Discussions in SG17 took as their starting point the notion of “digital object,” or more abstractly, “digital entity,” defined as an “entity” that is represented as, or converted to, a machine-independent data structure (of one or many elements) that can be parsed by different information systems, with each such digital entity having an associated unique persistent identifier.
These concepts are the basis for the deployment of systems of registries to improve the discovery and accessibility of not just identity-related management information, but information in digital form more generally. The Digital Entity Data Model and the associated Digital Entity Interface Protocol, also described in ITU-T X.1255, are basic information infrastructure elements that should span technology generations and stand the test of time.
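To picture what such a machine-independent, parseable structure might look like, here is an illustrative sketch; the field names are ours, not the Recommendation’s normative schema.

    import json
    from dataclasses import asdict, dataclass, field

    # Illustrative "digital entity": a parseable data structure of one or
    # many elements, carrying a unique persistent identifier. Field names
    # are made up for this sketch, not taken from ITU-T X.1255.
    @dataclass
    class DigitalEntity:
        identifier: str                               # unique persistent identifier
        entity_type: str                              # e.g. "document", "dataset"
        elements: dict = field(default_factory=dict)  # the entity's data elements

    record = DigitalEntity(
        identifier="20.5000/report-2014-001",
        entity_type="document",
        elements={"created": "2014-01-21", "format": "application/pdf"},
    )

    # Serialization shows the machine-independence: any system that can
    # parse JSON can consume the entity and read its identifier.
    print(json.dumps(asdict(record), indent=2))
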
What contribution can the implementation of the DO Architecture in fields such as banking, healthcare and transportation make towards addressing security and privacy?
Security is a fundamental capability of the DO Architecture, which is not the case for other distributed management systems for information in digital form in the Internet.
The basic administration of the identifier/resolution component of the DO Architecture is based on a public key infrastructure (PKI) regime. The creator of a digital object (or more abstractly, digital entity) has the ability to restrict access to their objects to known users – people or machines known to the system by their respective identifiers.
In practice, this system allows for a direct correlation between the security measures deployed and the degree of privacy achieved. Think of the medical records doctors keep on their patients. If a record is structured as a digital entity, access to this confidential information can be limited to authorized users, based on their identifiers and their ability to respond accurately to a PKI challenge. In some cases, access may mean permission to obtain a digital entity in its entirety. In other cases, access may mean permission to perform specific operations on all or part of the digital entity.
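A rough Python sketch of such a challenge (using the third-party cryptography package) may help; the flow and the names below are illustrative, not the DO Architecture’s actual interfaces.

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # Illustrative PKI challenge: the repository releases a record only if
    # the requester proves control of the private key registered against
    # their identifier. Identifiers and variable names here are made up.

    # Requester's key pair; the public half is registered with the system.
    doctor_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    registered_keys = {"20.5000/dr-example": doctor_key.public_key()}

    # Repository issues a random challenge (a nonce).
    nonce = os.urandom(32)

    # Requester answers by signing the nonce with the private key.
    pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                      salt_length=padding.PSS.MAX_LENGTH)
    signature = doctor_key.sign(nonce, pss, hashes.SHA256())

    # Repository verifies against the registered public key; verify()
    # raises InvalidSignature on failure, so reaching print() means success.
    registered_keys["20.5000/dr-example"].verify(signature, nonce, pss,
                                                 hashes.SHA256())
    print("challenge passed: release the requested part of the record")
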
How will Recommendation ITU-T X.1255 enable communications and transactions between “things”?
There is a tendency to view “things” in the Internet as being identified with respect to their physical manifestation, but specific information about things is more important. In the Internet today, an IP address is associated with “things” such as a port on a machine, typically a user’s computer or a network-based server, or more generally, a device such as a smart phone or a digitally enabled light bulb socket or a refrigerator.
Moving away from identifying information about things to identifying the information itself, represented in digital form, makes it possible to associate this information with other types of information. This ability to link related kinds of information in digital form holds great promise for enabling new ways of doing business in the Internet. ITU-T X.1255 describes metadata registries that are interoperable, and may be federated to ensure the long-term discoverability and utility of information structured as digital entities with resolvable persistent identifiers that endure over time.
What next for the DO Architecture?
The basic structure of the DO Architecture is applicable to information management needs of all kinds, but its development over the coming years will likely see the creation of multiple metadata schemas for different domains. We can expect the ability to search DO Registries to benefit from advances in search technology. Keyword search is still a primary technique, but other techniques including image understanding, speech analysis and pattern matching in large data sets will prove very useful.
In 2014, the DO Architecture will reach a significant juncture with a change in the administration of one of its key components, the Global Handle Registry (GHR). CNRI has maintained control over the administration of the GHR since first making it available on the Internet in 1994.
Plans are now well underway to transfer overall administration of the GHR to the DONA Foundation, a non-profit organization to be based in Geneva. The Foundation, once established, will be responsible for determining the set of system administrators, for digitally signing critical system information, and for establishing the overall policies and procedures governing the GHR’s operation. Multiple independent parties, which are authorized and credentialed by the Foundation, will be responsible for the distributed operation of the GHR.
Robert E. Kahn is Chairman, CEO and President of the Corporation for National Research Initiatives (CNRI), which he founded in 1986 after spending thirteen years at the U.S. Defense Advanced Research Projects Agency (DARPA). Dr. Kahn was responsible for the system design of ARPANET, the first packet-switched network. He is a co-inventor of the TCP/IP protocols and was responsible for originating DARPA’s Internet Program. In his recent work, Dr. Kahn has been developing the concept of the Digital Object Architecture, which provides a framework for interoperability across heterogeneous information systems. After receiving a B.E.E. from the City College of New York in 1960, Dr. Kahn earned M.A. and Ph.D. degrees from Princeton University in 1962 and 1964 respectively. He is a recipient of the 1997 National Medal of Technology, the ACM Turing award, the 2004 Presidential Medal of Freedom, the Japan Prize in 2008 and the Queen Elizabeth Prize for Engineering in 2013.

15 January 2014

Samsung and Apple Ready for Mediation

Media reports revealed that the companies have agreed to attend a mediation session in February before meeting in court in March. Apple’s and Samsung’s CEOs have agreed to attend the session with in-house attorneys only, after their legal teams met in early January to discuss settlement options. The companies didn’t reveal the details of the meeting, but it looks like they really want to end the silly court actions started by Steve Jobs.

Apple set out to patent-troll Samsung out of the market by any means, insulted that Samsung had stolen the idea for a touch-screen smartphone – an idea Apple had itself lifted from Nokia. The battle has involved many court actions all over the world that have ultimately failed to stop either Apple or Samsung producing their devices. The funniest part is that Apple’s only real win owed to a patent on the rounded rectangle, and even that is being appealed. Over the last two years, the parties have gone to trial twice, and US juries have awarded Apple a total of about $930 million, but European courts didn’t favor Apple.

Industry experts admit that while a meeting between two high-profile leaders does bode well, they don’t believe the companies will come to an arrangement. The situation has reached a point where they have been fighting so long that they might have already forgotten why. Apparently, the only winners in this war are the corporate legal teams, who have no reason to stop it.

French & Arabs Upset with US Security Obsession

A pending deal for the United Arab Emirates to purchase two intelligence satellites from France, worth $930 million, seems to be in trouble after the National Security Agency tried to put backdoors into the technology. The two military observation satellites contained specific US-supplied components providing backdoors to the highly secure information transmitted to the ground station.

The Arabs have asked the French to come up with components that won’t leak their plans to the US, saying that they would prefer Russian or Chinese firms to take over the project. Apparently, only the US believes that the Chinese and Russians are the ones spying on people.

So, the United Arab Emirates is ready to scrap the whole deal. Apparently, the country is hugely miffed that it bought French and still found itself spied on by the US. In reality, the French only won the deal because the American State Department had been such an arse about how the system could be used.

In the meantime, the UAE likes Russian technology a lot – for example, it used the GLONASS space-based navigation system as a redundancy feature on a Western European weapon system. As for France, defence experts can’t work out why the French were using American technology in the first place.

Media reports say that France regards its Pleiades spy satellite programme as a critical piece of the country’s sovereignty. Given that core competence, it seems odd that France would use American technology, though there is an agreement between Paris and Washington over the transfer of capabilities.

Finally, the deal is also problematic because Israel may want to limit the system’s capabilities. The concern is that the French satellites sold to the Arabs offer very high optical resolution and encrypted communications that could be used to guide a cruise missile to a target in Iran.

19 December 2013

NEW STANDARD ON INTER-CLOUD COMPUTING (Source: ITU website)

Cloud computing experts have reached first-stage approval (‘consent’) on a standardized framework for inter-cloud computing, an architecture whereby cloud service providers (CSPs) benefit from the services or resources of partnering CSPs to satisfy customer needs as dynamically as possible.
Recommendation ITU-T Y.3511 “Framework of inter-cloud computing for network and infrastructure” describes the framework for the interaction of multiple CSPs that might underlie the fulfillment of a single CSP’s service contracts with its customers.
The standard describes the possible relationship patterns among multiple CSPs – namely ‘peering’, ‘federation’ and ‘intermediary’ – based on several inter-cloud computing use cases and consideration of different types of service offerings. It goes on to introduce the concept of ‘primary’ and ‘secondary’ CSPs: the primary being the CSP required to fulfill a service contract with a customer, and a secondary being a CSP that interworks its services and resources with other partnering CSPs to aid the primary CSP in its delivery of services. Building on these concepts, the interaction of CSPs in the federation and intermediary patterns is discussed in depth, and Y.3511 concludes with the derivation of functional requirements for inter-cloud computing.
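The division of labour between a primary and its secondaries can be pictured with a toy sketch; the class and method names below are ours for illustration – Y.3511 defines roles and requirements, not an API.

    # Toy model of the primary/secondary CSP pattern: the primary serves
    # what it can and spills remaining demand over to partnering
    # (secondary) CSPs. All names are illustrative, not from Y.3511.
    class CSP:
        def __init__(self, name: str, capacity: int):
            self.name = name
            self.capacity = capacity   # units of some resource
            self.partners = []         # partnering secondary CSPs

        def provision(self, demand: int) -> bool:
            """Serve demand locally, delegating any overflow to partners."""
            remainder = demand - min(self.capacity, demand)
            for partner in self.partners:
                if remainder == 0:
                    break
                delegated = min(partner.capacity, remainder)
                remainder -= delegated
                print(f"{self.name} delegates {delegated} units to {partner.name}")
            return remainder == 0      # True if the contract can be fulfilled

    primary = CSP("primary-csp", capacity=80)
    primary.partners = [CSP("secondary-a", 50), CSP("secondary-b", 30)]
    print("fulfilled:", primary.provision(demand=140))
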
Although not integral parts of the Recommendation, Y.3511 also provides three annexes which detail “Use cases from the inter-cloud perspective”, “Use cases from telecom and non-telecom providers’ views”, and “Abstract service offering models for inter-cloud computing”.
Nine other standards found consent at SG13’s meeting, among them:
  • ITU-T Y.1903 “Functional requirements of mobile IPTV”
  • ITU-T Y.2254 “Capabilities of multi-connection to support enhanced Multimedia Telephony (eMMTel) services”
  • ITU-T Y.2253 “Capabilities of Multi-connection to Support Streaming Service”
  • ITU-T Y.3032 “Configurations of node identifiers and their mapping with locators in future networks”
  • ITU-T Y.3045 “Smart ubiquitous networks – Functional architecture of content delivery”
  • ITU-T Y.3033 “Framework of Data Aware Networking for Future Networks”
  • ITU-T Y.2065 “Service and capability requirements for e-health monitoring services”
  • ITU-T Y.2064 “Energy saving using smart objects in home networks”

Apple and Google App Stores Are Vulnerable

Too many apps in the Apple and Google app stores have been targeted for hacking. Security experts point out that financial apps on Android are the most vulnerable. In most cases, applications have been hacked and uploaded to third-party stores or Google Play in a bid to capture credentials from consumers, to operate maliciously, or to defraud the app’s creator by removing adware elements.

Security experts admit that hacked apps are showing up in various storefronts, such as Cydia, in a decrypted state – so by definition the software has been hacked. The specialists have seen multiple examples of tampering with the original code. Financial apps are a particular concern, because people trust them with sensitive data such as bank account numbers and passwords. It was found that 23% of sampled iOS financial apps had been hacked and reposted, as had 53% of Android financial apps.

As you know, Android users are able to download apps from third-party stores via a setting on their devices, while iOS users have to “jailbreak” their device to do so. In other words, they voluntarily use a hacking attack to give themselves the equivalent of “root” privileges for installing software. Thus far, iOS 7 hasn’t been jailbroken.

However, even Google’s official Play store can be a source of malware and hacked applications. A few months ago BlackBerry had to halt the rollout of its BBM app for Android because a hacked version appeared in the Play store before the official one and was downloaded over a million times. The experts also warn that it’s easy for someone to upload a fake “Bank of America” app onto Google Play and use freely available data about the bank to fool users. Google Play is not a vetted app store and carries a lot of cruft, while in Apple’s App Store users are almost certain to see only legitimate apps. Hacked code is therefore not a significant problem in Apple’s App Store, as the company vets all apps before they go live; Google, by contrast, removes apps only after complaints emerge or the apps are detected as carrying malware. Both platforms have a “kill switch” that can retrospectively delete malicious apps already installed on devices.

Bitcoin Is Not a Currency of the Future

An expert on digital currencies from Ernst & Young claimed that Bitcoin doesn’t have to replace normal currency to have a future. The expert described a number of myths around the currency, one of which is its supposed role as a replacement for “fiat” money.

Fiat currency is essentially currency that a government decrees to be legal tender, and Bitcoin was obviously not created as a replacement for it. Many people talk about how Bitcoin is going to take over, or how it lacks the properties needed for wide use. In fact, the currency was really created for electronic commerce and micro-transactions. Keep this in mind, and the future risks for the currency take on a different shape. At the moment, many experts are concerned with problems such as price volatility and the deflationary nature of Bitcoin. Since there will only ever be 21 million bitcoins, some fear the currency will have a “deflationary” element, with each unit getting more valuable over time.

Deflation is usually blamed for the “lost decade” in Japan, and Ernst & Young emphasized that it adheres to the mainstream economic view that mild positive inflation is healthy for a national currency. When Bitcoin is viewed as an ecommerce tool, however, deflation is not necessarily problematic.

Instead, the experts highlighted speed and fraud control as the most pressing problems for the virtual currency. To prevent fraud, the Bitcoin network “confirms” transactions roughly every ten minutes. This is one of Bitcoin’s weaknesses: you generally have to wait for 5 to 6 confirmations to be sure that your money hasn’t been spent twice, which at one block every ten minutes or so can take the best part of an hour.
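
A back-of-the-envelope estimate makes the delay concrete. Assuming block arrivals are exponentially distributed with a ten-minute mean – the usual idealization of Bitcoin’s proof-of-work, not measured data – the sketch below simulates the wait for six confirmations.

    import random

    # Block intervals modelled as exponential with a 10-minute mean
    # (an idealization; real intervals vary with network hash rate).
    BLOCK_INTERVAL_MIN = 10.0

    def wait_for_confirmations(n: int) -> float:
        """Simulated minutes until a fresh transaction has n confirmations."""
        return sum(random.expovariate(1 / BLOCK_INTERVAL_MIN) for _ in range(n))

    random.seed(1)
    trials = sorted(wait_for_confirmations(6) for _ in range(10_000))
    print(f"mean wait: {sum(trials) / len(trials):.0f} min")             # ~60 min
    print(f"90th percentile: {trials[int(0.9 * len(trials))]:.0f} min")

On average the wait is about n × 10 minutes, but the long tail is what bites a retailer standing at the till.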

Some businesses have decided that speed is worth the risk. One pub in London, for example, takes Bitcoin and accepts unconfirmed transactions as payment. However, not every retailer is able to do that – especially those selling goods more expensive than beer.

Ernst & Young believes that there are definite possible gains in Bitcoin in terms of lowered transaction costs. On the other side, there are also significant negatives in terms of accountability: how to deal with anonymous users, and how to regulate the market.

YouTube Advertising Revenues Estimated to Grow 50% in 2013

Google has never revealed the scale of the profits YouTube makes since acquiring the video streaming service for $1.65 billion seven years ago. However, analysts and researchers can still take guesses. The latest estimates come from eMarketer, which predicts that the service’s gross ad revenues will increase over 50% to $5.6 billion this year – more than 10% of Google’s total revenue.

Even after the company has paid ad partners and video creators their percentage, its net ad revenues are still predicted to reach almost $2 billion in 2013, up 65% from last year’s $1.18 billion. The researchers have also broken out YouTube’s net ad revenues in the United States, estimating that the figure will reach $1.08 billion, $850 million of it coming from video ads. That would give YouTube about a fifth of all US video advertising revenue for 2013.
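
The quoted figures hang together arithmetically, as a quick check using only the numbers in this article shows; the implied size of the US video-ad market is our own inference from them.

    # Consistency check on eMarketer's figures as quoted above.
    net_2013 = 1.95   # "almost $2 billion" net ad revenue, $bn
    net_2012 = 1.18   # last year's figure, $bn
    print(f"growth: {net_2013 / net_2012 - 1:.0%}")                  # ~65%, as quoted

    us_video = 0.85   # US net revenue from video ads, $bn
    share = 1 / 5     # "a fifth of all US video advertising revenue"
    print(f"implied US video-ad market: ${us_video / share:.2f}bn")  # ~$4.25bn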

Of course, these estimates are all guesswork, but eMarketer claims they are based on “hundreds of datapoints and studies about YouTube revenues, ad impressions, rates, usage and other information received from research companies, investment banks, Google reports and interviews with industry executives”.

Those interested can compare eMarketer’s analysis to other numbers: in May 2013, for example, Morgan Stanley predicted that the company’s gross revenues would reach $4 billion this year, while Barclays put the figure at $3.6 billion. A recent report by analyst firm Wedge Partners also suggested that YouTube accounts for about 10% of Google’s revenues (which corresponds with eMarketer’s analysis); if Google’s fourth quarter matches the average revenue of the previous quarters, that would put YouTube at approximately $5.7 billion for the year as a whole.

Still, it all remains guesswork, and Google is very unlikely to announce the real figures anytime soon. It is worth noting that the $1.65bn Google paid for YouTube seven years ago (a sum which shocked many people at the time) looks like something of a bargain today.

The company’s public statistics for YouTube reveal that the service attracts 1 billion people watching over 6 billion hours of video per month, with 80% of its traffic coming from abroad and 40% of viewing time happening on mobile devices.

Two Thirds of Web Traffic Is Bots

The security outfit Incapsula has found that about 62% of all website traffic today is generated by bots. That is a 21% rise on the 2012 figure, when bots accounted for a bit over 50% of traffic.

Of course, some of those automated software tools are malicious, but the rapid growth in traffic came mostly from good bots used by search engines to crawl sites and index their content. Other types of bots are employed by analytics companies to provide feedback about how a portal performs, or to carry out specific tasks such as helping online archives preserve material before it is removed.

The security company observed almost 1.5 billion bot visits within a three-month period across the 20,000 websites operated by its customers. Despite the overall growth in bot activity, the company pointed out that many of the traditional malicious uses of such tools are now less common: there had been a 75% drop in the frequency with which spam links were automatically posted.

In addition, it had seen a 10% drop in hacking-tool bot activity, which includes the use of code to distribute malware, steal credit cards, and hijack and deface sites. Another new trend was an 8% growth in the use of so-called “impersonator bots” – software which masquerades as a search engine or other legitimate agent and manages to fool security measures. Such bots are custom-made to carry out a specific activity, such as a DDoS attack – flooding a server with traffic until it crashes and takes a site or service offline – or stealing corporate secrets.

The growth in good bots shows that legitimate services are sampling the net more frequently, which can, for instance, allow search engines to add breaking news stories to their results more quickly.

01 December 2013

Internet Cafés Disappear

Internet cafés, once the communication hubs of developing countries, are fast disappearing. The reason is obvious: the rise of smartphones is making the trip to a café largely redundant.

For example, one Internet café in Rwanda went from 200 daily customers to just 10. India is suffering as well – some businesses in the southern city of Mysore, for instance, have opted to sell stationery or sweets instead of Internet access. Meanwhile, Internet café owners are having to diversify their offerings to include flight bookings, mobile phone top-up cards and gadget accessories. Even cafés in Myanmar, where mobile penetration is very low, are facing the same trend.

More developed countries, however, have seen cafés survive by catering for immersive Internet gaming. Even so, the number of such cafés in South Korea dropped to 15,800 in 2012 from 19,000 in 2010, and in China the number fell 7% from 2011 to 136,000 in 2012.

These statistics fly in the face of a five-year study released by the University of Washington in July, which found that Internet users in developing countries still rely on public venues such as cafés and libraries for Internet access even when smartphones are available. The researchers insisted that one technology won’t replace the other and that smartphones are not responsible for the current trend.