Thursday, 23 July 2015

In Tests, Yahoo Uses Google to Power Search Results and Ads


Yahoo is already partway through a 10-year deal to use Microsoft's Bing to power its search results, so the software giant is unlikely to be pleased that Google is now entering its territory.

Yahoo and Google are testing working together:

According to a report in The New York Times, Yahoo and Google are testing Google-powered search results and search ads on Yahoo. The arrangement was first spotted by Aaron Wall, the owner of SEO Book, who added that it is only a test and may never turn into an actual deal. A Yahoo spokeswoman said, “We work with partners to provide the best experience for our users, and we routinely run small tests and pilot projects to improve search results for their benefit,” adding that the company had nothing more to share at this time.

Microsoft and Google didn’t respond immediately:

Google is the world's most popular search engine, with a global market share of more than 71 per cent, according to NetMarketShare. Microsoft's Bing is in second place with around 10 per cent, while Yahoo, led by former Google executive Marissa Mayer, sits third with a 9.6 per cent share worldwide. In the US, Google holds more than 64.4 per cent of the search market, followed by Bing with 20.2 per cent and Yahoo with 12.7 per cent, according to the latest ComScore figures.

Marissa Mayer, Yahoo's CEO and a former Google executive who worked on search there, has signalled that the company is now focused on strengthening its own search business to win back market share. Any search deal between Yahoo and Google would, however, face antitrust scrutiny, as the European Commission already has antitrust cases pending over Google's dominant position. Earlier, in April 2015, Yahoo and Microsoft announced an amended agreement giving Yahoo the flexibility to enhance the search experience across mobile, tablet and desktop devices; a key component of that agreement is that Yahoo is free to partner with other companies to power its own search platforms.

Microsoft has lost a few important deals in the past, including a recent one with Facebook: in December 2014, Facebook announced it had ended its arrangement to provide web search results through Bing, saying it would instead focus on helping users find what people are sharing on Facebook through its own search box. According to tech experts, the latest test is a clear indication that Google is trying to move into Microsoft's territory.

Wednesday, 22 July 2015

Team Leads Google Expedition to Create 'Internet of Things' Technology


Google delivered one of its first major Internet of Things gadgets in the form of a smartwatch just a few years back. Now it is set to push the segment further by funding an expedition by a team of universities to create a new, robust platform for Internet of Things devices.

Under this initiative, Carnegie Mellon University will turn its campus into a living laboratory. The aim of the expedition is to create a new and more robust platform that greatly improves communication between internet-connected devices and people.

Project aims at improving connectivity

The project aims to radically enhance human-to-human and human-to-computer interaction through a large-scale deployment of Internet of Things technology. There is also a focus on protecting privacy and on improving features over time, enabling users to design applications to suit their daily needs.

Carnegie Mellon University is at the helm of this project and will work closely with other prominent universities, namely Cornell, Illinois and Stanford, together with Google, to create a new platform called GIoTTO to support IoT applications.

The initial plan is to focus on sensors that are inexpensive and easy to deploy within GIoTTO. New middleware will be developed to facilitate app development and to manage privacy and security concerns. Most importantly, new tools will be made available to end users so that they can build their own IoT experiences.

Google funding this project to create a more consistent platform

Google believes that bringing together the best minds and ideas across universities will accelerate innovation and IoT adoption on a larger scale. It will initially provide $500,000 to Carnegie Mellon University to launch the expedition. Privacy- and security-related aspects will be handled by a second CMU team, led by computer science professor Norman Sadeh.

Sadeh has stated that the team will demonstrate personalized privacy assistants, which will help users configure privacy settings and retain control over their data.

The future prospects of this expedition

The Internet of Things carries a lot of promise that still appears futuristic at the moment, but wide-scale adoption and deployment will bring a whole new level of experience to end users. Sensors embedded in buildings and everyday objects can eventually help create smart environments. CMU researchers have already built Snap2It, a system that lets users link to a printer or projector simply by taking a smartphone photo of it.

Google’s aim here is to fulfil the IoT’s promise and potential by helping to create a complete system of interoperable IoT technology that preserves privacy and ensures security at the same time.

Police Deleting Thousands Of Facebook Posts In 'Operation Jasper' Piracy Crackdown


Enforcement Officers Take Down Facebook Listings of Suspected Facebook Sellers

According to National Trading Standards, enforcement officers have raided 12 locations over the past few weeks and are still involved in another 22 ongoing investigations targeting criminals who have exploited social media channels to sell dangerous and counterfeit goods and to commit copyright theft, in what is described as the biggest operation of its kind.

The officers have taken down around 4,300 Facebook listings along with 20 profiles, issued over 200 warning letters and, in 24 cases, delivered letters to the homes of suspected Facebook sellers. Among the dangerous goods seized were Android TV boxes with unsafe mains chargers, as well as hundreds of counterfeit Cinderella dolls containing high levels of toxic phthalates.

In Worcester, two residential properties also contained `a host’ of counterfeit packaged computers, mobile phones, tablets, tracksuits, T-shirts and trainers, according to Trading Standards. Business minister Nick Boles said that counterfeiting and piracy of trademarked and copyrighted materials harm legitimate businesses, threaten jobs and pose real dangers to consumers, which is why strong action has been taken to stop these criminals through the Government’s funding of the National Trading Standards e-Crime Team.

Operation Jasper – Important Psychological Blow to Criminals 

Criminals tend not to act alone and are regularly connected to serious organised crime groups, and raising awareness of this, and of the risks faced by consumers who buy fake products online, was an essential part of the operation. Lord Toby Harris, chairman of National Trading Standards, stated that `Jasper has struck an important psychological blow against criminals who believe that they can operate with impunity on social media platforms without getting caught.

It shows we can track them down, enter their homes, seize their goods and computers and arrest and prosecute them, even if they are operating anonymously online’. Criminals who operate on social media have become brazen, believing that working from their living rooms with laptops, without needing to be physically present on market stalls, makes them less likely to get caught. Harris is pleased that the operation has proved that misconception wrong and urges consumers to be vigilant and to report any suspected online rogue traders to the Citizens Advice consumer helpline on 03454040506.

Crackdown by Officers – England/Wales/Northern Ireland 

Operation Jasper, run by the National Trading Standards e-Crime team alongside the National Markets Group, with members from the BPI, the Federation Against Copyright Theft and the Alliance for Intellectual Property, is manned by officers from the police and government agencies and is apparently one of the largest operations of its kind. It has focused on criminals who exploit social media to commit copyright theft and to sell dangerous and counterfeit goods.

The enforcement officers carried out a crackdown across England, Wales and Northern Ireland against those offering pirated and counterfeit products through Facebook. For many UK citizens, Facebook accounts have become more than just a place to manage their social lives; for some, they are a means of distributing infringing content that had previously been overlooked by the authorities.

According to the latest government IP Crime Report, social media has become the `channel of choice’ for online pirate activity, and over the past several months most of the leading torrent sites have had problems with their Facebook accounts. The Pirate Bay’s account was closed in December 2014 and again in May and June 2015, and the accounts of ExtraTorrent and RARBG were suspended on grounds of copyright infringement.

Tech Fail - Explaining Today's 3 Big Computer Errors


Computer failures took down United Airlines and the Wall Street Journal website and also halted the New York Stock Exchange (NYSE), but this is not the time to blame hackers. CNNMoney reached out to several top technologists, who all came to the same conclusion: based on what is known so far, this is very unlikely to be the outcome of a coordinated cyber-attack.

Two unidentified US officials said there was no sign that the computer problems affecting the NYSE were the ones that grounded the United Airlines flights. According to Reuters, United Airlines said an issue with a router was the cause of its outage: a malfunctioning network router left almost 5,000 United Airlines flights around the world grounded for over an hour, racking up large costs in fuel, overtime and other expenses.

The airline further said that it was recovering from a network connectivity issue and restoring flight operations. Mr Caldwell, chairman of Caldwell Securities, commented that `it’s disruptive though not wildly disruptive’. The halt in trading at 11.30 local time was described as unprecedented in its scale.

Every Failure Could be Different in Nature

The significant point is that each failure appears different in nature, whereas hacks usually exploit a single flaw to attack a wide variety of entities at the same time. United temporarily grounded all flights due to a problem in the way its computers coordinated with each other.

The NYSE, on the other hand, suspended trading due to an internal technical issue that involved computers only at the stock exchange. WSJ.com failed because its servers could not respond quickly enough and may have been overloaded.

None of these issues appear to be similar. Moreover, tech companies that analyse vast amounts of web traffic behind the scenes say they have not identified a coordinated attack. United flights resumed nationwide after an hour, and a source told CNN that the airline’s employees had investigated the computer issue and determined that the company had not been attacked and that this was not the work of hackers at all; there was no hacking and no evidence of any connection to what was happening at the NYSE.

Automated Software - Complex


The NYSE, which went back online after four hours, also assured the public that hackers were not responsible for the shutdown. It tweeted that the issue it was experiencing was an internal technical problem and not the result of a cyber-breach, and that it had chosen to suspend trading in order to avoid problems arising from the technical issue.

WSJ.com was also back by Wednesday afternoon, though as a bare-bones website with basic formatting, and did not mention the cause of the issue. Some speculate that the NYSE shutdown drew a flood of visitors to WSJ.com, overloading the site’s servers. What connects all these failures is that the companies involved are businesses that rely extensively on computer systems.

Automated software tends to be complex, at times involving millions of lines of code, where a single error can bring everything to a halt. Cyber security expert Joshua Corman commented that an increased dependence on undependable things allows for cascading failures.

Saturday, 18 July 2015

Internet Addresses Have Officially Run Out

Top Level Exhaustion – IPv4 Addresses Allocated for Special Use

When the internet was first developed, it was presumed that around 4 billion unique number combinations would be adequate. That prediction has not held up, much like tech pioneer Ken Olsen's 1977 claim that `there is no reason anyone would want a computer in their home’.

As users became more tech-savvy and more devices came online, internet usage grew. Each node on an Internet Protocol (IP) network, such as a computer, router or network printer, is assigned an IP address, which is used to locate and identify the node as it communicates with other nodes on the internet. The IP address space is managed globally by the Internet Assigned Numbers Authority (IANA) and by the five regional internet registries (RIRs), which are responsible in their respective territories for allocations to end users and to local internet registries such as internet service providers.

Top-level exhaustion took place on 31 January 2011. Of the five RIRs, three have since exhausted their allocations of blocks not reserved for the IPv6 transition: Asia Pacific on 15 April 2011, Europe on 14 September 2012, and Latin America and the Caribbean on 10 June 2014. Internet Protocol version 4 offers 4,294,967,296 addresses in total, though large blocks of IPv4 addresses have been allocated for special uses and are not available for public allocation.
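
To make those numbers concrete, here is a minimal sketch using Python's standard ipaddress module; the blocks listed are just a few well-known examples of special-use ranges, not the full registry.

```python
import ipaddress

# The entire IPv4 space: 2**32 = 4,294,967,296 addresses.
total_ipv4 = ipaddress.ip_network("0.0.0.0/0").num_addresses
print(f"Total IPv4 addresses: {total_ipv4:,}")

# A few well-known blocks set aside for special uses and therefore
# unavailable for public allocation (illustrative, not exhaustive).
special_blocks = [
    ipaddress.ip_network("10.0.0.0/8"),      # private networks
    ipaddress.ip_network("127.0.0.0/8"),     # loopback
    ipaddress.ip_network("169.254.0.0/16"),  # link-local
    ipaddress.ip_network("224.0.0.0/4"),     # multicast
    ipaddress.ip_network("240.0.0.0/4"),     # reserved for future use
]
reserved = sum(net.num_addresses for net in special_blocks)
print(f"Addresses in these special-use blocks alone: {reserved:,}")
```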

ARIN unable to Fulfil Allocation of Large IPv4 Address Block

Gartner researchers estimate that there will be around 25 billion internet-connected devices by 2020, more than six times what the developers had planned for when the net went live in 1983. Vint Cerf, one of the internet's founding fathers, says they knew this was coming: the pool of unallocated IPv4 address blocks has been drying up, and for the first time North America has run out of new IPv4 addresses.

The American Registry for Internet Numbers (ARIN), which serves the US, Canada, the Caribbean and the North Atlantic islands, has introduced a waiting list and cautioned that it will be unable to fulfil requests for large IPv4 address blocks because its address pool has dried up; as a result, ARIN is changing its allocation policies for the first time. Though the infrastructure running the internet was built with space for 4 billion addresses, which seemed plenty at the time, the sheer number of devices coming online means the IPv4 protocol is running out of space.

Initiated IPv4 Unmet Request Policy

The American Registry for Internet Numbers (ARIN) has now initiated its IPv4 Unmet Request Policy. Until now, organizations in the ARIN region could obtain IPv4 addresses whenever they needed them; ARIN can no longer fulfil all such requests, so ISPs that come to ARIN for IPv4 address space face three choices:
  • They could take a smaller block; ARIN presently has a limited supply of blocks of 512 and 256 addresses.
  • They could go on the waiting list in the hope that a block of the desired size becomes available in the near future.
  • They could buy addresses from an organization that has more than it needs.
Experts have advised those running websites to move to the far more spacious IPv6 specification, though the move can be expensive and time-consuming. Most of the large websites have already done so, while many smaller ones could be left without much space to continue growing. The addresses now running out are the ones computers use to identify themselves to each other in order to connect; the old IPv4 addresses consist of four numbers with dots between them.
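
For illustration, here is a small sketch, again using Python's standard ipaddress module, contrasting the old dotted-quad IPv4 form with the colon-separated hexadecimal IPv6 form; the addresses shown are taken from the documentation/example ranges.

```python
import ipaddress

# The familiar IPv4 form: four numbers (0-255) separated by dots, 32 bits in total.
v4 = ipaddress.ip_address("192.0.2.1")    # example address from the documentation range
print(v4.version, int(v4))                # 4 3221225985

# The newer IPv6 form: groups of hexadecimal digits separated by colons, 128 bits in total.
v6 = ipaddress.ip_address("2001:db8::1")  # example address from the documentation range
print(v6.version, v6.exploded)            # 6 2001:0db8:0000:0000:0000:0000:0000:0001
```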

IPv6 Picking up Pace

Being limited to four numbers meant that only about 4 billion addresses were available, and there are many more devices waiting to connect to the internet. IPv6 adoption is picking up pace, and ARIN has been encouraging organizations to consider moving to IPv6 addresses.

The supply of IPv6 addresses is vast and is not likely to run out. By adopting a much longer address, IPv6 provides space for 340 undecillion addresses, or 340 followed by 36 zeroes, enough to give every atom on Earth an address of its own. Businesses that have not switched so far can move towards the new specification, IPv6.
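
As a quick sanity check on those figures, the arithmetic is simply 2^32 versus 2^128; the short sketch below just prints both counts.

```python
# Address-space arithmetic behind the figures quoted above.
ipv4_space = 2 ** 32    # 4,294,967,296 addresses
ipv6_space = 2 ** 128   # roughly 3.4 x 10**38 addresses ("340 undecillion")

print(f"IPv4: {ipv4_space:,}")
print(f"IPv6: {ipv6_space:,}")
print(f"IPv6 offers {ipv6_space // ipv4_space:,} addresses for every IPv4 address")  # 2**96
```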

The move can be expensive, since companies may need new hardware that is compatible with IPv6; those that put off switching could end up buying the limited and probably expensive IPv4 addresses that remain. If users do not move over to the new system, they will eventually be unable to obtain addresses to get on the net, and the internet would stop growing at that point. Experts had warned earlier that there were only 3.4 million addresses left in North America and that they would run out over the summer.