Thursday, 30 April 2015

Google’s Project Fi

One of the most exciting pieces of news revealed on Wednesday came from Google. The company confirmed its plan to launch a wireless service called Project Fi. The service will leverage available Wi-Fi networks and switch to a carrier network such as Sprint or T-Mobile only when Wi-Fi is unavailable.

Market experts have dubbed this approach to mobile communications "Wi-Fi first". It is an approach already followed by companies like FreedomPop, Republic Wireless and Scratch Wireless. Recently, Cablevision launched Freewheel with the sole aim of providing a Wi-Fi-only service.

Impact on the Wireless Industry: 

This announcement from the web giant comes as a boon for the wireless industry, for the Wi-Fi-first segment and, not least, for the customers associated with the company. As of now, a customer typically pays around $10-20 per GB for data usage, and conventional cell phone service costs around $140 per month per subscriber; according to a 2013 study by Validas, the average smartphone user wastes nearly $28 worth of unused data each month. Cell phone service is not supposed to be this costly.

Reports released by Cisco in 2014 indicated that nearly half of all mobile data traffic was already being carried over Wi-Fi. With the growing number of Wi-Fi options, that share can be expected to rise to nearly 90 percent. With 1GB of mobile data costing customers around $10, shifting traffic to Wi-Fi could save users nearly $700 billion every year.

If the system is implemented properly, the Wi-Fi-first model clearly has the capability to fill coverage gaps, complement the cellular networks, and end up hurting the business of both Verizon and AT&T around the world.

The Wi-Fi-first model can help customers save enormous amounts of money, and Google's new project can increase adoption of the model through a few critical steps. The first step is to ensure that all Android devices have Wi-Fi-first intelligence built in from the outset, so that phones rely on Wi-Fi and switch to cellular service only when Wi-Fi is unavailable.

The second step is to ensure constant Wi-Fi connectivity. Although Wi-Fi is available almost everywhere, not every customer takes full advantage of it. Google is aiming to make the entire experience seamless and mainstream. Wi-Fi is a critical part of the industry, and given Google's excellent track record, the project has the potential to challenge both Verizon and AT&T.
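The Wi-Fi-first logic described above can be sketched in a few lines. This is purely an illustrative model; the network names, the signal threshold and the `pick_network` helper are assumptions for the sketch, not anything Google has published about Project Fi.

```python
# Illustrative "Wi-Fi first" selection: prefer a usable Wi-Fi network,
# and fall back to a cellular carrier only when none is found.

WIFI_USABLE_DBM = -75  # assumed cutoff for an acceptable Wi-Fi signal

def pick_network(wifi_networks, carriers):
    """wifi_networks: list of (ssid, signal_dbm); carriers: list of names."""
    usable = [(ssid, dbm) for ssid, dbm in wifi_networks if dbm >= WIFI_USABLE_DBM]
    if usable:
        # Choose the strongest usable Wi-Fi network.
        return max(usable, key=lambda n: n[1])[0]
    # No usable Wi-Fi: hand off to the first available carrier.
    return carriers[0] if carriers else None

print(pick_network([("HomeAP", -60), ("CafeAP", -80)], ["Sprint", "T-Mobile"]))  # HomeAP
print(pick_network([("CafeAP", -80)], ["Sprint", "T-Mobile"]))  # Sprint
```

The point of the sketch is the ordering of the checks: cellular is only ever consulted after Wi-Fi has been ruled out.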

Wednesday, 29 April 2015

Short Clicks Vs Long Clicks



Long Clicks – Short Clicks – Determines User’s Satisfaction


Google and other search engines refine their result rankings by examining huge amounts of user data and scrutinising user activity within the results. Among the things the engines observe in determining a user's satisfaction with the search results is the distinction between long clicks and short clicks.

Time to long click is an internal metric Google uses to determine search success. Understanding this metric is essential for search marketers assessing changes in the search landscape, and it is also useful for shaping optimisation strategies.

A long click takes place when a user performs a search, clicks on a result, and remains on the site for a long period of time. In the ideal scenario, the user does not return to the search results to click on some other result or to reframe the search.

Long clicks are, in general, a proxy for satisfaction and success. A short click, on the other hand, takes place when a user performs a search, clicks on a result, and then quickly returns to the search results to click on another one. Short clicks are an indication of dissatisfaction and failure.
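The long/short distinction boils down to dwell time: how long the user stays on the clicked result before returning to the SERP. A minimal sketch, assuming a 30-second cutoff (the actual thresholds search engines use are not public):

```python
def classify_click(dwell_seconds, returned_to_serp):
    """Label a click as 'long' (satisfied) or 'short' (dissatisfied).

    dwell_seconds: time spent on the clicked page before any return;
    returned_to_serp: whether the user came back to the result page.
    """
    SHORT_CLICK_CUTOFF = 30  # assumed threshold, in seconds
    if returned_to_serp and dwell_seconds < SHORT_CLICK_CUTOFF:
        return "short"   # quick bounce back: a dissatisfaction signal
    return "long"        # stayed, or never returned: a satisfaction signal

print(classify_click(5, True))     # short
print(classify_click(240, False))  # long
```

Note that a user who never returns to the SERP counts as a long click regardless of dwell time, matching the "ideal scenario" described above.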

Search Engine Result Page - SERP

A short click is when a user performing a search bounces back and forth between the search engine result page (SERP) and the sites displayed within the results.

With a long click, by contrast, the user does not come back to the result page immediately to click on another result or to change the query. From Google's perspective, an outcome that ends with a long click has successfully fulfilled the search query.

Moreover, a search engine can also use the time taken between clicks as a measure of satisfaction, which reduces the chances of surfacing results that do not deliver valuable content for a specific search query.

This data can be stored for each search query and compared to the normal short-click behaviour for each SERP position. For instance, if Google finds that the first result for a search query has a remarkably high bounce rate, the search engine's algorithm reads this as a negative signal.
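The per-query comparison described above can be sketched as a simple aggregate: compute each result's short-click (bounce) rate for a query, then flag results that sit well above the query's average. The click data and the 1.2x flagging margin here are invented purely for illustration.

```python
from statistics import mean

# clicks: list of (result_url, was_short_click) observed for one query.
clicks = [
    ("a.com", True), ("a.com", True), ("a.com", False),
    ("b.com", False), ("b.com", False), ("b.com", True),
]

def short_click_rates(clicks):
    """Return each result's fraction of clicks that were short clicks."""
    totals, shorts = {}, {}
    for url, was_short in clicks:
        totals[url] = totals.get(url, 0) + 1
        shorts[url] = shorts.get(url, 0) + (1 if was_short else 0)
    return {url: shorts[url] / totals[url] for url in totals}

rates = short_click_rates(clicks)
baseline = mean(rates.values())
# Flag results whose bounce rate is well above the query's average.
flagged = [url for url, r in rates.items() if r > baseline * 1.2]
print(rates, flagged)
```

Here `a.com` bounces two clicks out of three against a query average of one half, so it is the one flagged as a candidate negative signal.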

Google – Measures Short Click – Leveraging Tracking Mechanisms

Subsequently, the search engine may reduce the page's rank within the SERP in a future algorithm update. For a result to be flagged in this way, it would perhaps need a statistically above-average short-click rate.

Google measures short-click activity by leveraging its existing tracking mechanisms. Long clicks are important to Google because they provide a means of measuring satisfaction with a result based on downstream behaviour.

Google is aware that its search algorithm is still not perfectly smart and can make mistakes, which aggressive search engine optimisation could exploit to lead it down the wrong path.

Long clicks provide a feedback mechanism, a kind of human quality assessment that may be lacking in the algorithm. Google's comments suggest it could be a part of the algorithm, though what the future holds in store remains to be seen.

Tuesday, 28 April 2015

Xiaomi Mi 4i First Look

The Xiaomi Mi 4i is the latest smartphone launched by Xiaomi in Delhi, in the presence of founders Lei Jun and Bin Lin. Hugo Barra delivered the keynote. Even though the phone's price of 12,999 took everyone by surprise, many details of the Mi 4i had already leaked well before the launch, including its display, resolution, chipset, camera specs and more. While the smartphone shares many features with the Mi 4, it comes with plenty of improvements, such as Android 5.0 Lollipop, a lighter body and a larger battery.
  • Build and Design: The smartphone has a non-removable polycarbonate rear cover which wraps around the phone, giving it rounded corners and smooth edges. The cover has an anti-grease coating for a fully soft-touch matte finish; according to the company, users will even be able to remove stains made with a permanent marker. Both the power/standby button and the volume rocker are made of steel and mounted on the right side of the phone, while the left side holds the dual-SIM tray. The phone weighs about 130g and is around 7.8mm thick, which makes it very comfortable to hold despite its 5-inch display. The company has used One Glass Solution to enhance the slim factor.

  • The Display: The phone has a 5-inch Full HD display with a pixel density of 441ppi, and its sunlight viewing technology helps balance the display in bright sunlight.

  • Storage and Connectivity: The phone has 16GB of internal storage, of which 10.92GB is available to the user. At present, the company has not provided a microSD slot.

  • Chipset, OS and RAM: The smartphone comes with a Qualcomm Snapdragon 615 SoC, an octa-core processor made up of two quad-core Cortex-A53 clusters clocked at 1.7GHz and 1.1GHz. The phone pairs an Adreno 405 GPU with 2GB of LPDDR3 RAM, and the company promises there will be no lag when switching between apps. The look is not very different from the preceding phones. With the dual-SIM facility, the phone can take 4G SIM cards in both slots.

  • Battery: According to the company, the phone is powered by a 3,120mAh battery, the highest-capacity battery the company has fitted in a 5-inch device. The company also claims that a single charge can last up to a day and a half, and the phone can charge to 40 percent within an hour.

  • Camera Quality: The camera is similar to that of the Mi 4, relying on a 13MP Sony sensor with a 5-element f/2.0 lens. The front camera is a 5MP unit, with subtle changes made to enhance its quality.

Moore's Law: Beyond the First Law of Computing


Moore’s Law – Reflection of History of Computing Hardware


Moore’s Law is a reflection of the history of computing hardware: the number of transistors in a dense integrated circuit has increased tremendously, doubling roughly every two years. The law is named after Gordon E. Moore, co-founder of the Intel Corporation and Fairchild Semiconductor, whose 1965 paper described the quantity of components per integrated circuit as doubling every year.
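Doubling every fixed period is plain exponential growth, and the trajectory the law describes can be projected in a couple of lines. This is a toy calculation, not a fit to real chip data; the 2,300-transistor starting point is roughly the Intel 4004 of 1971.

```python
def projected_transistors(initial_count, years, doubling_period_years=2):
    """Project a transistor count under Moore's Law-style doubling."""
    return initial_count * 2 ** (years / doubling_period_years)

# A chip with 2,300 transistors, doubling every two years for two decades,
# reaches ten doublings: 2,300 * 2**10 = 2,355,200 transistors.
print(projected_transistors(2300, 20))
```

The striking part is how quickly the curve runs away: the same formula over four decades predicts billions of transistors, which is roughly where chips ended up.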

It is not guaranteed that computer chips will keep growing smaller and more powerful; the technology is getting near its capacity. The most powerful computing machine in the world may also be the most universal: the human brain, which can perform computation on a scale that the most advanced supercomputers are unable to match.

Neurons act on millisecond timescales, slow compared with the fastest processors, and sometimes fail to activate. What enables the brain to function regardless is parallel computing: the capability to work on a problem simultaneously with several different parts of the brain. The push to imitate the brain's capacity for parallel computing holds not only the promise of better computing but also a way of coping with the imminent death of one of the most famous laws in modern technology.

Chips’ First Getaway – Apple Newton (1993) 

It is estimated that there are around 1.4 billion smartphones on the planet, and though most of the prevalent devices are designed in California or South Korea, the chips that power them are designed in Cambridge, England.

ARM, the company behind it all, may be no match for Intel in size, but its chips beat their U.S. rival on energy efficiency as well as footprint, both crucial for smartphones. Underpinning ARM's chips is a simplified approach to computing first explored at Stanford University and the University of California, Berkeley.

Stephen Furber, professor of computer engineering at the University of Manchester in England, states that `this is one case of U.S. academic research being taken up by a U.K. company very successfully’. In the 1980s, he designed the ARM chip at Acorn, the now-defunct predecessor of ARM. The chips’ first getaway was in the Apple Newton in 1993, and the rest is history. Furber states that `then and now, the Apple brand was magic and opened doors’.

Inevitable Technological Progress

ARM, like every other chipmaker, is dealing with the end of a trend which has delivered more powerful computers every two years for almost half a century. The term Moore’s Law has come to represent inevitable technological progress, though it began as a very specific observation about computer chips. Gordon Moore, in his seminal 1965 paper, predicted that the number of transistors on a chip would double every 18 months while the cost fell at the same rate.

While it is still debated why transistor density has followed this exponential path, its effect is not in doubt. Doyne Farmer, professor of complexity economics at the University of Oxford, states that `it has been a windfall’. The laws of physics now threaten to end that windfall.

In other words, transistors have to be made from atoms, and as transistors shrink so that more can be packed onto a chip, one ultimately reaches a limit where the atoms run short. Even before running out of atoms, the reliability of the tiniest transistors falls while their cost rises, owing to the increased complexity and difficulty of producing them.

Monday, 27 April 2015

Google Changing its Influential Search Engine


Google is planning to change the way its influential search engine recommends websites on tablets and smartphones, a shift expected to influence where many people shop, eat and obtain information. The revised formula, scheduled for release soon, will favour websites that Google considers `mobile friendly’.

Websites that do not fit the description will be demoted in Google’s search results on tablets and smartphones, while those that qualify will be more likely to appear at the top of the rankings, a prized position that can translate into more visitors and money.

Though Google’s latest formula will not affect searches on laptop and desktop computers, it will have a large effect on how and where individuals spend their money, now that most people rely on their smartphones to compare products in stores and to locate restaurants. Hence, some search experts have billed Google’s new rating system as `mobile-geddon’.

Itai Sadan, CEO of website-building service Duda, comments that `some sites are going to be in for a big surprise when they find a drastic change in the amount of people visiting them from mobile devices’.

A Substantial Change to Its Mobile Search Rankings

According to Matt McGee, editor-in-chief of Search Engine Land, a trade publication which follows every tweak the company makes to its closely guarded algorithms, this could be the most substantial change Google Inc. has made to its mobile search rankings. A little more on why Google is doing it and what is happening –
  • Websites need to load quickly on mobile devices to stay in Google’s good graces. Content also needs to be easily accessible by scrolling up and down, without the need to swipe left or right. It also helps if buttons for making purchases or taking other actions on the website are easy to see and touch on small screens.
  • On websites designed only with PC users in mind, graphics take a long time to load on mobile devices and columns of text do not fit on the small screen, making reading uncomfortable. Google has been advising websites to cater to mobile devices for several years, since that is where people are increasingly navigating for information
  • The number of mobile searchers in the U.S. has increased by around five percent while queries on PCs have been decreasing, as per research firm comScore Inc. It estimated that around 29 percent of U.S. search requests, about 18.5 billion, in the final three months of last year were made on mobile devices. Google processed the bulk of those searches, with two-thirds of the market in the U.S. and more in many other countries
Websites to Comply with Google’s Mobile Standards

Google revealed its plans almost two months in advance to reduce complaints, and created a detailed guide and a tool for testing compliance with the new standards. The company has faced backlash over past changes to its search formula.

Two of the bigger revisions, in 2011 and 2012, were focused on weeding out misleading websites. Though the attempt sounded reasonable, several websites complained that the company’s changes unfairly demoted them in the rankings and made their content difficult to locate. While most major merchants and big companies have websites that meet Google’s mobile standards, the new formula is likely to hurt millions of small businesses which do not have the money or incentive to adapt their sites for smartphones. McGee is of the opinion that `a lot of small sites have not really had a reason to be mobile friendly until now, and these changes will not be easy for them’.

Search Formula – Variety of Factors 

The search formula weighs a variety of factors in determining the rankings of its results. One of the most important considerations is whether a site contains the most relevant information sought by the searcher. The new pecking order in Google’s mobile search could demote some sites to the back pages of the search results despite their content being more relevant to a request than that of other sites which are easier to access on smartphones.

Gartner analyst Whit Andrews states that this could be a very unfortunate consequence, though a justifiable one, since viewers are not inclined to visit sites which take a long time to open or are difficult to read on mobile devices.

He adds further that `availability is part of relevancy and a lot of people are not going to think something is relevant if they cannot get it to appear on their iPhone’. Neil Shah, research director for devices and ecosystems with Counterpoint, a technology market research firm comments that `in the end, Google is an advertising platform company and its success is based on the success of its digital marketing customers’.

Thursday, 23 April 2015

Twitter Cuts Off DataSift To Step Up Its Own Big Data Business


In its recent push to generate more revenue, leading micro-blogging site Twitter has been building up its own business in areas such as commerce and advertising. Last Friday, however, the company made its move in another area: big data analytics. Twitter announced that it will soon terminate its agreements with all third parties that resell Firehose data, the complete stream of Tweets together with the related metadata.

From now on, Twitter will rely on its own in-house big data analytics team, built up after the acquisition of Gnip in 2014, and will seek to develop direct relationships with big brands, data companies and others to measure market trends and analyse consumer sentiment and other patterns that can be better understood by tracking users' online behaviour.
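As a toy illustration of the kind of trend measurement these data customers perform, the sketch below tallies brand mentions in a batch of tweets. The tweets and brand names are invented for the example; real Firehose consumers work with the full live stream and its metadata rather than a small list.

```python
from collections import Counter

def brand_mentions(tweets, brands):
    """Count case-insensitive brand mentions across a batch of tweet texts."""
    counts = Counter()
    for text in tweets:
        lowered = text.lower()
        for brand in brands:
            if brand.lower() in lowered:
                counts[brand] += 1
    return counts

tweets = [
    "Loving my new Acme phone!",
    "acme support was slow today",
    "Globex keynote starts in an hour",
]
print(brand_mentions(tweets, ["Acme", "Globex"]))  # Counter({'Acme': 2, 'Globex': 1})
```

Aggregates like this, computed continuously over the stream, are the raw material for the market-trend and consumer-sentiment analysis described above.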

DataSift is one of the biggest companies affected by Twitter's move. Unfortunately, the move came quickly, and DataSift was initially unable to post its own reaction to the termination.

NTT Data, a company which deals in Japanese-language Tweets, was still listed as a Twitter Firehose partner at the time of the announcement, but a few days later Twitter sources confirmed that NTT Data will also be affected by Friday's announcement.

According to Nick Halstead, CEO and founder of DataSift, his company was blindsided by Twitter's announcement, which came without any warning to DataSift. He added that the companies had been discussing a renewal of the deal and that, having been a Firehose partner of both Facebook and Twitter, DataSift does not expect the move to affect its other business discussions.

The negotiations are still ongoing, as Twitter wants its partners to be part of an open ecosystem. The direction was already clear from Twitter's move last year, when it acquired Gnip, one of the Firehose resellers, which then competed against DataSift.

The main reason behind the acquisition of Gnip was Twitter's belief that the best way to support the distribution of Twitter data is to have direct relationships with its data customers, since much of the big data industry uses Twitter's data to build analytic solutions. The acquisition of Gnip was Twitter's first step towards developing those direct relationships.

It is clear that social networking platforms are now looking for more direct ways to monetise their data for analytics. DataSift can still work with Twitter: if a third party buys Tweet data from Twitter, it can supply it to DataSift to develop new algorithms. The company is currently raising a new round of funding; it has raised about $78 million to date.

Facebook Blames 'Bug' for Tracking Non-Users

Recently, Facebook acknowledged that it could have tracked non-Facebook users visiting third-party websites through the embedded Facebook "Like" button, without the permission or knowledge of those web users, but said this was due to a "bug" which is now being fixed.

The issue came to light when an academic report commissioned by the Belgian Privacy Commission claimed that Facebook is violating European Union law through its tracking methods and contract terms.

According to a blog post by Richard Allan, Facebook's Vice President for European policy, Facebook is determined to dispute the findings of the Belgian report, which claims that Facebook does not provide a legitimate way for its users to opt out of being shown advertisements based on its behavioural-targeting techniques.

A team of privacy engineers and experts that analysed the Belgian researchers' findings stated that the report got it wrong on multiple points in analysing and asserting how Facebook collects and uses personal information to provide better and more relevant advertisements to its more than a billion users around the world.

In response to the claim that Facebook uses Social Plugins to place cookie files on the browsers of people who do not use Facebook in any form, Allan stated that the Belgian researchers identified the "datr" cookie and thought Facebook was using it to track non-Facebook users, whereas it was later identified as the result of a bug.

He said that tracking non-Facebook users is not the company's practice, but the researchers found that these bugs may have sent cookie files to some users who are not on Facebook.

According to Facebook officials, it is not the company's practice to attach cookies to the browsers of people who have never used Facebook or visited Facebook.com to sign up, but who have visited sites with Facebook's Social Plugins.

Allan claimed that the Belgian report contains a number of misstatements; for example, Facebook provides an opportunity to opt out of behavioural and social advertising, while the report claims this is misleading and that Facebook's tracking violates standard Web practice.

Most social networks use cookies in three main ways: to personalise the platform for existing users, to enhance user security, and for advertising purposes. Facebook says it is transparent about its cookie policies and has long disclosed its use of cookies meant to improve the real experience of users.

Facebook, Google and other major internet platforms have come under growing scrutiny over how they track users to serve ads for online products and services, as they have developed ever more refined techniques based on the purchasing and browsing behaviour of internet users.

Reusing Technology


Numerous companies provide consumers with refurbished pieces of technology. While this is not a new idea, it is one worth considering: there are many benefits to reusing technology.

Connecting People and Items

It seems as though technology is not made to last. Dishwashers and laundry machines may have a useful life of only a few years. When a new Motorola or Apple iPhone is released, people tend to get rid of their old phones in order to get the new one. While it is understandable that new technology is popular, many older models can still be effective. Not everyone needs a phone that connects to the Internet. Not everyone needs a computer to play a DVD.

It can be quite helpful for electronic devices to be connected with people who have a need. An older phone may be ideal for a senior citizen who is living alone or would like a phone to keep in the car in case of emergencies. An older computer may be ideal for a college student or a new family to America who simply wants a computer that connects to the Internet and can create documents. Whatever the situation is, there is a need for older pieces of electronics.

Reusing Older Models

From time to time, older models and devices may stop working. While it can be easy to fault a manufacturer, the reality is that wear and tear is a prominent factor that cannot be overlooked. Fortunately, a number of retailers and businesses provide the replacement parts that are needed. Suppose an individual has a need for some Macbook Pro replacement parts.

There are plenty of places that can be contacted by phone or email to get the parts. Technology has made this easy: enter a part number, such as 661-4426, and a database will recognize whether the part is in stock. Finally, there are techies and other professionals who can use the replacement parts to make a particular device work once again.

Reusing technology can have a powerful impact on the world. It can decrease the amount of waste in the world. Furthermore, it helps people get information or work done that is necessary. Older technology can still function well today.

Tuesday, 21 April 2015

Hackers Who Breached White House Network Allegedly Accessed Sensitive Data


Hackers Breached White House Network


According to a recent story published by CNN, Russian government hackers breached the White House’s computer systems late last year and gained access to sensitive details, though US officials disagree. Officials had stated earlier that the October White House breach affected only an unclassified network, but sources informed CNN that the hackers gained access to real-time non-public details of the president’s schedule.

The sources also informed CNN that the hackers were the same ones behind a damaging cyber-attack on the US Department of State around the same time last year, which forced the department to shut down its email system for an extended period. That related attack on the State Department has been characterized as the worst hack ever on a federal agency. The White House is no stranger to attacks from foreign spies.

The Chinese have been linked to many high-profile attacks on White House unclassified systems, including employee emails. Reports of the breach came as government officials have grown more concerned about cyber threats from Russia. James Clapper, the Director of National Intelligence, informed a Senate committee in February that `the Russian cyber threat is more severe than they had earlier assessed’.

Immediate Measures to Evaluate/Mitigate Activity

Ben Rhodes, White House deputy national security adviser, stated that the breached White House system held no sensitive data. He informed CNN that the White House has an unclassified system and a classified, top-secret system, and that they do not believe their classified systems were compromised.

A White House spokesperson who tried to play down the report said it was based on a security breach that had already been revealed to the public. Spokesperson Mark Stroh informed the media that the report did not refer to a new incident, that any such activity is taken seriously, and that in this case officials had made it clear at the time and taken immediate measures to evaluate and mitigate the activity.

He also said that, as officials did last year, the US would not comment on who could have been behind the attacks. The Secret Service, the FBI and US intelligence agencies are investigating the breaches, which CNN's sources say were the outcome of one of the most sophisticated cyber-attacks ever directed at US government agencies.

Theft of Private Data – Government/Corporation/Individuals 

The recent report comes amid hacker thefts of private data from governments, corporations and individuals, ranging from sensitive emails to medical records to financial information; possession of such data could be of great value either for enabling criminal acts or for assisting government spying.

As per a senior department official, none of the department's classified email systems were affected in the State Department breach, though, as CNN reported, hackers used that breach to break into the White House's network.

After the White House security breach was revealed in October, security researchers suspected that hackers working for the Russian government were behind both attacks, according to the Washington Post's story. Despite efforts made by the State Department to safeguard its security, hackers were able to keep accessing the system, with the result that the network was owned for months by Russian hackers.

Monday, 20 April 2015

Lenovo ThinkPad X1 Carbon: Gadget Review


Lenovo brings the latest iteration of its famous ThinkPad Carbon series of laptops, which aims to bridge the gap between fashionable and business laptops. The ThinkPad series has kept up its successful trend of providing a slim, lightweight laptop powered by cutting-edge technology.

The Design And Feel Of The Laptop

The Lenovo ThinkPad X1 Carbon weighs just above 1.3kg and is 18.46mm thick, which makes it extremely easy to carry around. It is a decent laptop which neither takes up much space nor is bulky enough to make rough handling a chore.

Lenovo has provided a wide array of ports around the edges: the charger point, an HDMI output, a Mini DisplayPort, two much-needed USB 3.0 ports and a standard 3.5mm headset socket. For wired internet connectivity it provides an Ethernet outlet. Unfortunately this laptop doesn't have a card reader, though some of the latest models have a SIM card slot to provide integrated 3G connectivity.

Multiple Choices Available In Terms Of Specifications and Software

As before, Lenovo brings multiple variants of the new ThinkPad X1 Carbon to fulfil the needs and demands of different consumers. The variant we reviewed had the latest Intel Core i7 processor running at 2.4GHz with integrated Intel HD Graphics 5500. It was a high-end configuration with 8GB of RAM and a 256GB Solid State Drive (SSD), of which 216GB was available to the user.

The best features of the laptop are its 10-point multi-touch support and its amazing 2560x1440-pixel screen. Using Windows 8 on this laptop gives a completely different feel, with its attention to detail and crispness of display. On the connectivity front, Lenovo has provided standard Bluetooth v4.0 along with Wi-Fi b/g/n, and some models can work on 3G networks with SIM connectivity. The battery capacity is 50Whr, which Lenovo claims gives a run time of around 10.9 hours, though only under certain usage conditions.

Lenovo has somewhat ruined the software appeal with unwanted pre-installed software such as the PC App Store, Lenovo PC Experience and others. Even the pre-installed Norton Internet Security suite is quite bleak and often bombards the user with registration requests.

A Sublime and Robust Performance

The Lenovo ThinkPad X1 Carbon is a flagship business laptop which, like its predecessors, provides robust performance. Boot-up is quick, and there are no lags or slowdowns while performing day-to-day tasks. Heavy apps and software are likely to run smoothly on this laptop, making for a memorable experience.

The ThinkPad X1 Carbon range is designed with a distinct purpose: Lenovo is looking to address users’ needs with a consumer-friendly product. There are better-looking business laptops on the market, but the Carbon series stands out for its implementation of features that make it extremely easy to get things done quickly. This laptop offers a splendid display, a large battery and strong features at a price of Rs. 1,75,000.

GoPro Hero4 Black Review: The Best Action Cam Gets Upgraded


GoPro_Hero4_Black
The main difference between a normal camera and an action camera lies in the form factor. An action camera can be placed on a surfboard out on huge waves, or strapped onto dogs, ceiling fans and flying drones, without the risk of damage. In fact, they are made for such purposes, where the use of a normal camera would be bizarre and impractical.

The GoPro is a tiny, box-like action camera with amazing picture quality and a rich ecosystem of accessories from GoPro and third-party companies. There are two models, the GoPro Hero4 Black and the GoPro Hero4 Silver, with subtle differences between them. The Silver costs $100 less than the Black; the Black has additional features and a faster processor, the best video quality, more shooting options, and support for a wide range of mounting options.

The History

This series of action cameras has been in existence since 2004, and the GoPro Hero range has been capturing adventure sports ever since. Initially, photographs were captured as still shots on 35mm film. GoPro can be called a pioneer of sorts of the action camera, and the years since have only advanced the original idea.

The Exterior

A key aspect of an action camera’s exterior is that it should be inconspicuous, so the core design and ergonomics haven’t changed much over the years. The camera measures 41x59x30mm and weighs 88g excluding its waterproof cover; with the cover on, it weighs around 152g.

A small LCD is located on the left side and the lens protrudes from the front. Next to the LCD are two LED indicators: the top one blinks blue to indicate the WiFi signal, while the lower one shows the camera status and also works as the battery indicator. The power button is located beneath the LCD and also changes modes.

There is another camera status indicator on the rear of the action camera, along with a port for accessorizing the camera with an external LCD monitor such as the LCD Touch BacPac.

The right side plays host to the Micro-HDMI port, the Micro-USB port, the MicroSD card slot and a speaker. The battery door is at the bottom, with a tiny slip of paper attached to the battery to help pull it out.

The Interior

This action camera is equipped with a 12-megapixel 1/2.3-inch CMOS sensor and an F2.8 ultra-wide-angle lens. It can capture 4K (3840x2160) footage at 30fps, 1080p video at up to 120fps, and 720p footage at 240fps. It has a mono microphone for recording audio, plus an optional 3.5mm input accessory for a better external mic if required. For better audio recording, there is also an integrated analog-to-digital converter.

Producing Valuable Content: How to Get Started


SEO
Every SEO expert will tell you that the key to a successful SEO campaign is valuable content. What they often forget to mention is what valuable content actually is and how to produce it. Producing valuable content doesn’t have to be difficult as long as you know the basic guidelines to follow. To help you get started, this article discusses some useful tips on producing valuable content.
Useful and Informative

The two aspects to consider when writing content for your site are usefulness and the information you are adding. First of all, an article needs to offer some use to visitors. A good way of making sure your site’s content is useful is to put yourself in a visitor’s place and ask these questions:


  • Did I get tips and tricks from the article?
  • Is the article amusing?
  • Can I do something new with the information I get from the article?

Aside from being useful, an article should also be informative. News, updates and other content in the same category may not always be useful, but they can always be informative.

Concise and Readable

Here’s another pair of aspects to keep in mind when developing valuable content. Long articles are not always good, especially if most of the actual content is not useful or informative enough for your users. Keep it short and concise to maximize the value of the content itself.

I’m not saying you can’t write long articles; as a matter of fact, you can. However, long articles need to be kept readable for them to be as valuable to users as shorter, more compact articles with a lot of information. To help make long articles more readable:

  • Divide the article into paragraphs and state what the paragraph is all about in the first sentence. Most of your users will scan through paragraphs quickly, so mentioning your point at the beginning of the paragraphs really helps.
  • Furthermore, divide the long article into parts with subtitles and pointers. An article with subtitles is much easier to scan and often more readable than one that only contains long paragraphs.
  • Use images and visual elements to keep users engaged. Reading long articles online can be quite boring, so visual cues are needed to keep users glued to the screen. It will also make your site more appealing.

Credibility Matters! 

Writing an article for the web is not like writing one for magazines or papers. Site owners tend to ‘forget’ about citing their sources and providing valid data. This greatly damages the article’s credibility and, of course, the site’s credibility as well.

When you do present data from other sources, cite the sources properly. Your users will automatically associate citations with better credibility, causing them to trust your content even more. According to SEO experts from Blue Hat Marketing, using internal linking to cite other pages on your site is also a great way to improve your site’s SEO performance.

Saturday, 18 April 2015

Build A Mars Base with A Box of Engineered Bugs


Mars_Base
Travel to Alien Planet – Bug Boxes – Engineered Microbes


The next time humans set foot on an alien planet, they may not have to travel alone: they could bring along small, lightweight ‘bug boxes’ full of engineered microbes that would make life on these planets much more liveable.

Pioneering settlers will need food, fuel and shelter to survive on a distant world, and hauling bulky supplies from Earth would be costly. Synthetic biology offers another option.

Microbes weigh little and take up next to no space on a spacecraft, yet once a mission lands on Mars they could multiply by feeding on materials available there. The products of their labour could provide the essentials for a human settlement.

NASA has begun research to realise this vision, according to Lynn Rothschild at the Ames Research Centre in Moffett Field, California. Rothschild, who leads NASA’s new Synthetic Biology Initiative, aims to build designer microbes for future crewed space missions, and shared her vision at the BioDesign Forum last week in Cambridge, UK.

Synthetic Biology – At Crossroads of Biology/Engineering 

Synthetic biology lies at the crossroads of biology and engineering. Its practitioners are building a biological toolkit comprising chunks of genes, known as biobricks, each performing a certain function, such as making a bacterium generate a natural antifreeze molecule. Biobricks can be inserted into other microbes to give them that function.

With this approach, a microbe capable of surviving on an alien planet could become one that helps humans endure life there. Consider energy: most earthly microbes would die in an extraterrestrial environment rich in carbon dioxide and nitrogen, the two main components of Martian air. But Anabaena, an ancient cyanobacterium, thrives in these conditions, metabolising both gases to make sugar.

Rothschild states that ‘as long as it has warmth and some shielding from ultraviolet radiation, it could do well on gases in the Mars atmosphere’. Anabaena uses most of the energy it produces from carbon dioxide and nitrogen itself, but synthetic biologists could encourage the cyanobacterium to share its supplies.

Waste – Feed the Microbes

At a synthetic biology competition last year, the International Genetically Engineered Machines (iGEM) contest, a team from Brown University in Providence, Rhode Island and Stanford University in California showed how inserting genetic machinery from E. coli made Anabaena excrete more of its energy as sugar.

They also showed that the engineered Anabaena could support colonies of other bacteria on that sugar. In theory, such microbial colonies could make oil, plastic or fuel for the astronauts. The team, led by Andre Burnier, a recent Brown graduate advised by Rothschild, also came up with options for supplying human settlers on Mars with mortar and bricks. They started with a bacterium known as Sporosarcina pasteurii, which unusually breaks down urea, the main waste product in urine, excreting ammonium and making the local environment alkaline enough for calcium carbonate cement to form. Waste created by the astronauts could feed the microbes, which in turn could help bind fine rocky material on the planet’s surface into bricks.

Friday, 17 April 2015

Logitech Bluetooth Adapter : Gadget Review


Logitech Bluetooth Adapter

Logitech Bluetooth Adapter Offers Simplicity Backed By Performance


In the world of gadgets, smartphones are the most talked-about and most-bought devices right now. Still, some companies make a tidy sum by providing gadgets that are genuinely useful even if buyers don’t know about them yet. Logitech, JBL and others have invested heavily in creating high-end Bluetooth speakers that offer impressive sound.

Usually the better-sounding Bluetooth speakers are expensive and find few takers, and most people still use 2.1 or 5.1 channel speaker systems that offer perfectly good sound. Logitech is set to bridge this gap with a Bluetooth adapter that can easily turn any speaker system into a wireless one.

Logitech Brings Elegant and Compact Design

For a Bluetooth adapter, size matters, and Logitech has craftily designed an ultra-compact, handy unit. The Logitech Bluetooth Adapter measures just 50x50x23mm, fits easily in the palm, and can easily be kept hidden from view.

Tough plastic makes it highly durable. The adapter supports Bluetooth v3.0 with a range of 15m, provided the connected devices are within line of sight. A single Bluetooth button at the top handles pairing with other devices. The rear of the adapter has 3.5mm and RCA outputs and a power socket. Furthermore, a relatively small blue LED on the bottom blinks while pairing and stays lit once paired.

Logitech Bluetooth Adapter Brings Splendid Performance

To start the pairing process, long-press the Bluetooth button. Pairing is required only the first time; after that the adapter connects automatically. It works splendidly across a wide range of devices: home theatre systems, Bluetooth-enabled music players, Apple products like the MacBook Air and iPhone, and other smartphones.

Furthermore, the adapter can be paired with two different devices as audio sources at the same time, and users can actively switch between them. Once a connection is established, the adapter simply converts the Bluetooth wireless data into analog signals that are transmitted to the connected speakers.

Whether you play music on the device itself or through speakers connected by the adapter, the quality is almost the same, apart from a slight difference that can only be picked up by listening very closely. The Bluetooth connection remains firm when the connected device is in clear line of sight; move further away and occasional dropouts appear. Hence it is wise to keep the devices in sight to enjoy the music.

Give A New Life To Old Speakers With Bluetooth Adapter

The Logitech Bluetooth Adapter was initially launched at a price of Rs. 1,999, but online it can easily be bought for just Rs. 1,200. At that price, it should be an easy choice for any enthusiast looking for a way to enhance the capability of an age-old music system.

Leaked Details, If True, Point To Potent AMD Zen CPU


AMD Zen CPU
For more than a year, rumours about AMD’s next-generation CPU architecture, code-named Zen, have fired the imagination of the company’s fans as well as anyone looking for a more efficient competitor to Intel. The technical details may change, but geeks should be ready for a next-generation chip that packs a wallop. The leaked slide covers only the highest-end part.

Even though the slide has managed to capture plenty of attention, details of the chip might change drastically between how it appears now and how it looks at launch. The leaked slide itself might not even be accurate.

Reports from Fudzilla indicate that the new CPU will offer up to 16 Zen cores, each supporting 2 threads for a total of 32 threads. Rumours also suggest the new core will use Simultaneous Multithreading, as opposed to the Clustered Multi-Threading that AMD debuted with the Bulldozer family and has used for the past four years.

Each core is said to be backed by 512KB of L2 cache, with 32MB of L3 cache shared across the entire chip. This suggests the L3 structure is inspired by Bulldozer’s; if AMD can overcome that design’s cache weaknesses, there could be a real improvement in performance. The new integrated GPU is also said to offer double-precision floating point at 1/2 single-precision speed.
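As a quick sanity check, the rumoured figures are internally consistent. A back-of-the-envelope calculation (all numbers are unconfirmed leak figures, not AMD specifications):

```python
# Sanity-check the rumoured Zen topology. All inputs are leaked,
# unconfirmed figures, used here purely for arithmetic illustration.
cores = 16
threads_per_core = 2        # Simultaneous Multithreading (SMT)
l2_per_core_kb = 512
l3_shared_mb = 32

total_threads = cores * threads_per_core       # matches the rumoured 32 threads
total_l2_mb = cores * l2_per_core_kb / 1024    # aggregate L2 across the chip
l3_per_core_mb = l3_shared_mb / cores          # each core's share of the L3

print(total_threads, total_l2_mb, l3_per_core_mb)
```

So the leak implies 32 threads, 8MB of L2 in aggregate, and 2MB of shared L3 per core, which lines up with the 4-core module description below it on the slide.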

Is it too good to be true? 

The CPU layout shown in the leaked slide makes a lot of sense. The design is clearly modular: each Zen module consists of 4 CPU cores, 8 threads, 2MB of L2 and a slice of that unbelievable L3 cache. What has raised eyebrows among fans, however, are the 64 lanes of PCIe 3.0, the HBM interface, and quad-channel DDR4.

The reason: Intel’s highest-end server parts come with 32 PCI-Express lanes, and even though quad-channel DDR4 is available, Intel only supports 4x DDR4-2133. And while HBM has arrived and the day is not far when it will reach servers and desktops, 16GB of HBM memory would make a whole lot of difference; Zen’s claimed 512GB/s memory interface would be one of its most important achievements. The CPU is expected to be unveiled for the server world in another 12 to 18 months.

AMD will make history if it can meet its power-consumption and IPC targets. The details have not yet been confirmed, but if the company can launch a part with 64 lanes of PCI-Express, a totally revamped CPU core, an integrated graphics processor and 16GB of HBM memory, it will be big.

Thursday, 16 April 2015

Why is My Website Loading Slowly


loading
As a website owner, you likely already know the importance of ensuring your site loads quickly. You know that even mere seconds can make all the difference between your visitors sticking around and going elsewhere to your competitors. You also know that if they do jump to a different site, there’s a strong likelihood that they’ll never return to yours.

Yet if you’re reading this article, what you probably don’t know is what’s currently slowing down your site and driving them away in the first place.

Indeed, it’s all well and good knowing that your website is currently performing slower than it should, but let’s be honest, it’s nigh on impossible to do anything about it until you’ve discovered where the issue lies.

To help you get to the bottom of your site speed issues, here’s a look at some of the more common reasons why your website could be loading slowly.

It could just be your connection 

We don’t mean to insult your intelligence, but it’s often the most obvious issues that we tend to overlook. Before we go any further, it’s worth double checking that it’s not your own internet connection that’s causing the problem.

You could use a tool like Speedtest to check whether your connection is performing as it should. Alternatively, for a more practical check, visit a few different sites and see whether they load as quickly as you’d expect. If you can, try loading your site over a different internet connection.
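As a rough illustration of this kind of check, the sketch below times an arbitrary fetch operation. The function name and structure are my own invention, and the commented-out example assumes network access:

```python
import time

def time_request(fetch):
    """Time an arbitrary zero-argument fetch callable, returning seconds elapsed."""
    start = time.perf_counter()
    fetch()
    return time.perf_counter() - start

# Example (requires network access) -- uncomment to time a real page load:
# from urllib.request import urlopen
# elapsed = time_request(lambda: urlopen("https://example.com").read())
# print(f"Page fetched in {elapsed:.2f}s")
```

Comparing the elapsed time for your own site against a couple of fast-loading reference sites, over two different connections, quickly tells you whether the problem is the site or the link.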

If it’s still slow, and if those other sites load much faster than yours, it’s time to explore other possibilities.

Problems with your hosting 

These days, most web hosts and website builders promise unlimited bandwidth for your site, but it’s worth pointing out that ‘unlimited’ usually comes with a fair use policy. If you’re regularly exceeding your bandwidth, that could be causing the problem, in which case it might be time to upgrade.

That said, it could be another issue entirely with your hosting account. It’s certainly helpful to check your hosting company’s status updates to see if they’re aware of any short term problems.

If not, you may be left with two options. First, contact your hosting provider to see if they can identify what’s causing the issue and if there’s anything they can do to help. Alternatively, you might be left with the last resort option of packing up your site and moving to a better host with a reputation for faster loading speeds.

Large image sizes 

Those huge images you uploaded to your site may look awesome, but if they haven’t been properly optimized for the web, they can have a big detrimental impact on your loading times: the larger the image file, the longer it takes to load. If you’re using WordPress or another Content Management System, there is a range of plugins that can help you optimize your images for faster loading.
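To see why file size dominates load time, here is a minimal back-of-the-envelope sketch. The function and the example sizes and link speed are illustrative assumptions, not measurements:

```python
def load_seconds(file_size_kb, bandwidth_mbps):
    """Rough transfer time for one file: size in kilobytes, link speed in megabits/s."""
    bits = file_size_kb * 1024 * 8
    return bits / (bandwidth_mbps * 1_000_000)

# A 2MB unoptimized photo vs. an 80KB web-optimized version on a 4 Mbps link:
print(round(load_seconds(2048, 4), 2))  # roughly 4.19 s
print(round(load_seconds(80, 4), 2))    # roughly 0.16 s
```

On these assumed numbers, one unoptimized photo alone costs several seconds, which is why image optimization is usually the first thing to fix.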

If that still doesn’t solve the problem, you might want to explore replacing those images with other, smaller ones to see if that does the trick.

Your website code 

As a browser gets to work on loading your site, it needs to trawl through all that complicated back-end code to present your site the way it was intended. As you can imagine, that takes time, and again, the longer it takes, the slower your load speed.

For CMS users, this is where caching plugins come in really handy. Install one on your site, and it will generate static HTML pages that serve up your content much more quickly.

Who Makes The Most Reliable Hard Disk Drives?


Hard drive
The most daunting question for any PC builder or owner is: who makes the best hard drives? This question is constantly tracked by Backblaze, an unlimited online backup company. Backblaze has released new data for 2015 that sheds some incredible light on the manufacturers.

From Backblaze’s perspective, the data matters because the company itself wants to be sure it is buying the best hard drives available. It currently runs tens of thousands of these drives, and every time one fails, it takes time and effort to pull the drive, insert a replacement in the slot, and rebuild the RAID array.

So which hard drive manufacturer is the most trustworthy? 

The data released by Backblaze can be explored in two ways: by specific drive model or by manufacturer name. The reports show that as of December 2014 the company had 22,902 Hitachi drives, 47 Toshiba drives, 1,174 Western Digital drives and 15,528 Seagate drives.

Although these drives were bought in different years, the company has installed thousands of new 4TB and 6TB models over the last few years. Notably, Backblaze treats reliability as a secondary factor and cost efficiency as the main one: it buys whichever competitive drives offer the best dollar-per-gigabyte ratio.

chart
According to the data, Hitachi seems to have the most reliable product in the market: Backblaze’s Hitachi (HGST) drives recorded a failure rate of only 1.4%. The drives from Western Digital are impressive too; after operating for three years, nearly 92.4 per cent of them are still in operation and functioning well.

In comparison, Seagate’s 3TB drives were disastrous, with roughly 40% of them failing over the course of 2014. Although the company’s overall failure rate was recorded at 9.6%, the introduction of 4TB drives has improved its showing: those drives have a failure rate of 2.6%, far better than the 3TB models.
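Failure rates of this kind are conventionally annualized, i.e. expressed as failures per drive-year of operation. A minimal sketch of that arithmetic, using made-up counts rather than Backblaze’s actual raw data:

```python
def annual_failure_rate(failures, drive_days):
    """Annualized failure rate: failures per drive-year, as a percentage."""
    drive_years = drive_days / 365
    return 100 * failures / drive_years

# Illustrative only -- not Backblaze's actual counts:
# 1,000 drives running for a full year, 26 of which fail, gives a 2.6% rate.
rate = annual_failure_rate(failures=26, drive_days=1000 * 365)
print(round(rate, 1))  # 2.6
```

Counting drive-days rather than just drives matters because fleets grow mid-year: a drive installed in October contributes only a few months of exposure, not a full year.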

Which are the best and worst single hard drives? 

In terms of reliability and value, the 4TB models offered by Seagate and Hitachi are the best. If longevity is the priority, users have a list of options: the Hitachi GST Deskstar 5K3000 3TB has an excellent record of surviving three years, and among 4TB drives the Hitachi Deskstar 5K4000 is one of the best choices. Buyers should be wary of Seagate’s 3TB 7200.14 model, which rated very poorly in this report.

Considering that Backblaze does not have many Toshiba and Samsung drives installed, it is hard to derive meaningful numbers for them. While Toshiba and Fujitsu might be reasonable choices, Seagate’s acquisition of Samsung’s hard drive division raises questions about reliability. We may have to wait for a future report on those.

Editing Your Corporate Video: What You Need to Know for a Satisfactory Result


So you already have your corporate video for your company - and whether your company is based in London or another area, you know how important it is to add credibility to what you have to offer. After months of hard work and planning, you now have something to show for all your efforts, and can honestly say that you have done your best to make a brilliant video. But, it doesn’t end there – you still have to consider the editing of your video in order to make it as perfect as can be.

But even if you rely on a professional video editing company for corporate video production in London, when it comes to editing your corporate video, it still pays to have a basic idea of how editing works and how it can be used to your advantage. If you know what you want, then you can more easily impart this to the editing company, and everyone will be more-than-satisfied in the end.

The different styles of video editing

Tower Bridge
There are basically two different styles of video editing: non-linear and linear. Non-linear, also referred to as ‘film style’ video editing, is more appropriate for short video projects which are also more structured. With non-linear video editing, you can expect the different segments of the video to be organised into various clips, so that all the shots can be easily accessed and then edited (lengthened or shortened, moved, or even removed according to your specifications). In addition, non-linear video editing allows the editor to easily cut and paste any changes to the actual audio or video.

More on the non-linear video editing method 

For non-linear video editing, the editor(s) will apply several phases. They will first make use of rough editing, where the original video material is edited by choosing some of the best footage or clips. In this phase, no audio mixes or transitions will be added, as the structure is still in the process of assembly to make sure the story is clear.

The second phase of non-linear video editing is the tight edit, where transitions and music during and in-between scenes are added and the sound is cleaned up as well. This is also where titles and credits are added. The third and final phase of video editing is the final mastering, where the entire programme is played and then recorded onto your preferred medium, be it a hard drive, a DVD, or Blu-Ray, among others.

More on the linear video editing method

The second style of video editing is the linear method, also known as traditional video editing. With this, the editing process involves copying the video and the audio from one tape to another. Keep in mind that with linear video editing, changes cannot easily be undone once the editing is finished. Linear video editing is admittedly more complex and time-consuming than non-linear video editing, as the editor has to go through metres of tape just to find a particular shot, and the editor needs to have a very clear concept of the order of the various shots in order to minimise editing time. However, linear video editing is more ideal for longer video projects (30 minutes or more).

Corporate video production experts such as Raw Productions are also quick to point out the importance of labelling during the editing process, which will then be included in the final printed media for future information and use. These labels include titles, sub-titles, the list of producers, and the length of the final video.

Having a good idea of what video editing is all about – and what to expect from it – will help you determine exactly what you want for your finished product – and in the end, you can rest easy in the knowledge that you have a brilliant corporate video in your hands.

Monday, 13 April 2015

Project Composer

Project_Composer
Credit:techcrunch
In the past few years, there has been tremendous growth in the use of cloud file-sharing services. Going by rumours in the market, Dropbox seems to be preparing to bring back HackPad.

Dropbox acquired HackPad, a collaborative documents service, a year ago. The Y Combinator-backed start-up grew to be a preferred choice and an important tool for taking notes at events and conferences, as well as in the classroom.

The main reasons for the tool’s immense popularity were its simple design, ease of use and real-time nature. Now, out of nowhere, a new entrant with similar features and a remarkable resemblance to HackPad has appeared on the market, and it is called Composer.

The project composer: 

The project was spotted on Product Hunt, a website that highlights the best products on the market based on votes from its user community.

Adam Waxman of SeatGeek posted a link to Composer, but most users replied that they were unable to access it. Even though the app is currently hosted on the Dropbox.com domain, the name appears to come from an internal, ongoing project.

The project first requests authorisation from users to access the files and folders of their Dropbox account. Even though most people clear this initial authentication step, they end up with an error message indicating that they are not allowed to use the service as of now.

In case users believe they received this message in error, they can email the company at feedback@dropbox.com. One user who did gain access to Composer said it looks much like Evernote, as it allows collaborative note-taking. She further added that Composer users can add Dropbox files, tables and tasks to their notes.

In the end, what this looks like is a transformation of HackPad and its integration into Dropbox. Even if Composer never becomes as popular as Evernote, it can still become a cause for concern for other companies, since users of Dropbox’s cloud storage will have Composer ready to hand. Although Dropbox has declined to give information about the project, it has stated that additional information will be released soon.

Composer has kept the essence of HackPad, and the company is working to make it future-ready. Dropbox can certainly expect tough competition in the market, but it has the advantage of a minimal feature set and ease of understanding. The project has so far received a mixed response, and Dropbox is working on improving it.

Selecting an Online Marketing Company


There is a tremendous amount of competition in the online business world. In order to get ahead of your competition, you need to be able to market your products and services better than the companies you are competing against. This can be much easier said than done. There are many ways to go about online marketing. Some of these methods have proven to be much more effective than others. The trick is to use the right methods in the right locations for the specific products you are trying to sell. To do this, you need to hire professional online marketing consultants to help you. There are literally thousands of companies to choose from that provide this service. Here is how to go about selecting an online marketing company.

1. Get some quality recommendations

If you were looking for a dentist, you would ask people you trust to recommend a good one. Recommendations are how we find out about many excellent professional services, and the same idea applies to online marketing. Ask people you trust within the online business community to give you the names of marketing companies that they trust and respect, and find out which companies your colleagues have hired before. Hopefully, you will be able to compile a good list of names this way.

2. Read online reviews

The list of websites that provide online reviews is practically endless. Therefore, it should not be difficult for you to find some reviews of online marketing companies. Reading reviews that have been written by actual customers will be helpful to you because it will allow you to know what to expect. This information will give you the ability to make a more educated decision when it comes to hiring an online marketing company.

3. Find out the services that are offered

Once you have finished compiling your list of online marketing companies you are thinking of using, you can begin to narrow down your list by taking a look at the various services that are offered by each company. While you may think that the vast majority of online marketing companies will offer exactly the same services, you may be surprised how much they differ. You need to determine exactly what methods of marketing you believe will be most effective for your particular product or business. You then need to find a company that specializes in that method of marketing.

4. Compare prices

The prices that are charged by online marketing companies will tend to vary just as much as the services they offer. Therefore, it is in your best interests to compare as many companies as possible in an effort to get the lowest price.

Saturday, 11 April 2015

Limited Access State for Inadvertent Inputs




Google Inc. – Patent Application – Limited Access State for Inadvertent Inputs



It has been reported by a staff news editor at Politics & Government Week, from Washington D.C., with VerticalNews journalists reporting that a patent application filed on October 28, 2013 by inventor Mittal, Sanjev Kumar of Bangalore was made available online on February 12, 2015. The patent assignee is said to be Google Inc.

The news editor obtained the following from the background information provided by the inventors: computing devices such as mobile phones and tablet computers can generally perform various functions, such as executing applications stored on them and outputting information (documents, emails, pictures and so on) for display. Some computing devices include a limited access state that prevents an unauthorized user from accessing and viewing the applications and information stored on the device, effectively locking it.

These computing devices let the user provide a specific input, for instance a passcode, pattern or biometric information, to unlock the device and gain access to the information or applications stored on it. Such locking techniques provide some degree of security by ensuring that only users who know the specific input needed to unlock the computing device can access the applications and information stored on it.

Computing Devices Constructed to Execute Action 

Some computing devices are configured to execute an action in response to receiving more than a threshold number of unsuccessful unlock attempts; for instance, the device could block further unlocking attempts for a predetermined amount of time.

In other instances, the device could delete some or even all of the information stored on it once the number of unsuccessful unlock attempts exceeds a threshold, and in some cases information could thereby be unintentionally deleted from the device.

Supplementing the background information of the patent application, VerticalNews correspondents also obtained the inventor's summary. In one instance, the disclosure is directed to a method that includes receiving, by a computing device operating in a first limited access state, an indication of a first input.

Procedure Focussed on Managing Access State

The techniques of this disclosure are focussed on managing the access states of a computing device. In some applications, the device may operate in a first limited access state, for instance a first locked state.

While operating in the first limited access state, the computing device may compare received user input to the particular input required to transition the device to an access state, i.e. an unlocked state.

In response to receiving user input that does not correspond to the particular input needed to unlock the device, it may switch from operating in the first limited access state to a second limited access state.

For instance, in response to determining that the received user input is most likely not an attempt to unlock the device from the first limited access state, the device could be configured to switch to operating in the second limited access state.
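The state transitions described in the application can be sketched as a small state machine. This is a minimal illustration only, not Google's implementation: the passcode check, the heuristic for spotting an inadvertent input, and the failure threshold are all assumptions made for the example.

```python
from enum import Enum, auto

class AccessState(Enum):
    FIRST_LIMITED = auto()   # the normal locked state
    SECOND_LIMITED = auto()  # locked state entered after inadvertent input
    UNLOCKED = auto()

class Device:
    MAX_FAILED_ATTEMPTS = 5  # assumed threshold before the device reacts

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failed_attempts = 0
        self.state = AccessState.FIRST_LIMITED

    @staticmethod
    def _looks_like_unlock_attempt(user_input: str) -> bool:
        # Hypothetical heuristic: a deliberate unlock attempt resembles a
        # passcode (all digits); anything else is treated as inadvertent
        # input, e.g. pocket touches.
        return user_input.isdigit()

    def handle_input(self, user_input: str) -> AccessState:
        if self.state is AccessState.UNLOCKED:
            return self.state
        if user_input == self._passcode:
            # Correct input: transition to the unlocked access state.
            self.state = AccessState.UNLOCKED
            self._failed_attempts = 0
        elif not self._looks_like_unlock_attempt(user_input):
            # Input likely not an unlock attempt: switch to the second
            # limited access state instead of counting it as a failure.
            self.state = AccessState.SECOND_LIMITED
        else:
            # A genuine but unsuccessful unlock attempt.
            self._failed_attempts += 1
            if self._failed_attempts >= self.MAX_FAILED_ATTEMPTS:
                self._wipe()
        return self.state

    def _wipe(self):
        # Placeholder: in the patent's example, some or all stored data
        # may be deleted after too many unsuccessful unlock attempts.
        pass
```

The point of the second limited access state is visible in the branch structure: inadvertent input changes the state without incrementing the failure counter, so pocket touches cannot trigger the data-wiping threshold.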

Friday, 10 April 2015

Prosthetic Hand Recreates Feeling Of Cotton Bud Touch


Prosthetic Hand – Creation of a Sense of Touch

Invention and technology have given rise to a prosthesis that, for the first time, helps people who have lost a hand through amputation or accident to recognise the natural sense of touch that is so often taken for granted. The sensation is so good that one person who tried the device commented that, on leaving the lab, it felt like leaving his hand at the door.

For ages, researchers have been working on recreating a sense of touch for those who have lost their limbs, with the intention of helping them control, and feel a sense of embodiment with, a prosthetic limb. One group of researchers is ahead of the competition, having completed a successful trial, while many others are recruiting people to test their technologies.

Recent developments have enabled people who have lost their limbs to move a prosthesis and grip objects. The ability to distinguish what is being touched is essential to controlling and accepting a prosthesis as part of the body, and this was accomplished for the first time by Dustin Tyler and his colleagues at the Louis Stokes Veterans Affairs Medical Centre in Cleveland, Ohio.

Cuff of Electrodes – 3 Main Nerves in Residual Area of Limbs

The team found a means of transmitting long-term, realistic physical sensations, such as the feel of a cotton wool ball, to two individuals who had lost their hands in industrial accidents.

There have been several attempts in recent years to create a sense of touch by conveying vibrations to a person's residual limb to signal pressure on the prosthesis, though this proved more distracting than helpful and has not been adopted.

Attaching electrodes to the inside of residual nerves was also attempted, though the tingling sensation produced diminished over time. Tyler's team then attempted something more complex: two years ago, a cuff of electrodes was implanted around the three main nerves in the residual area of each man's limb, the nerves that normally transmit sensory information from the hand to the brain.

Each cuff comprised electrodes that could stimulate different parts of the nerve, and wires connected the electrodes to a machine that delivered a stream of electrical pulses linked to the prosthesis used by each individual. The intention is for the system eventually to operate wirelessly.

Real Achievement – Prosthetic Limb 

Tyler states that `as soon as they stimulated the nerves in the first subject, he immediately said that he felt his hand for the first time since it was removed'. The team then switched on each electrode in turn, and the individuals felt a tingling sensation as though it were coming from the tip of their prosthetic thumb, then the tip of the index finger, until the touch sensation had moved across the whole of their prosthetic limb.

Providing a real sense of touch to someone who has lost a limb is a great achievement; doing the same for someone paralysed by a spinal cord injury could be more challenging, since a spinal injury can prevent the nerves from communicating with the brain. Communication between the brain and the prosthetic limb would therefore have to bypass these nerves.

In 2012, a team at the University of Pittsburgh, Pennsylvania implanted a device in the brain of a woman paralysed from the neck down to help her control a highly sophisticated prosthetic arm with her mind.

Thursday, 9 April 2015

Hacking Game 'Touchtone' Turned Me into an NSA Spy


The thought of becoming an NSA spy sends a chill down my spine, but after playing TouchTone I have been eagerly awaiting the chance to try being a responsible spy. This iOS puzzle game got me hooked like never before.

I experienced the thrill of hunting enemies of the state just by scouring emails and text messages. This is completely different from anything in my daily life. As co-creator Mike Boxleiter puts it, people might find the entire game creepy as well as voyeuristic; all they need to do is stop for a while and think about the game again.

What is the game all about? 

The game starts with a simple training exercise in which the player has to decide whether an email chain between a potential customer and a bookstore employee should be monitored. At the bottom, the game highlights a transcript indicating whether the email contains any content suggesting potential threats. The player has to identify anything that threatens national security.

A simple example: an exchange about The Anarchist Cookbook clearly indicates there is not much to worry about, whereas any exchange about crafting homemade bombs certainly requires the player's attention.

Soon after completing the level, I was scolded by one of my handlers, a self-styled patriot, who pointed out that some students had been emailing about going out to smoke weed. I had marked the mail as not at all threatening.

From his perspective, this was about drug trafficking and required monitoring. According to Boxleiter, most of the people on Wall Street are more or less criminals. The handler was not satisfied with my work, because I was reasoning like a rational human being, which is not at all how a spy thinks. He stated that our system is so messed up that it is unconcerned with people who carry out fraud worth millions but focused on people with beards.

The game is about exposing the loopholes in the system currently in use. In other words, in the world of TouchTone racial profiling does not exist, which means the spies are digging up the maximum possible information on any known suspect.

He further added that the NSA is about drawing connections between different people. In other words, the game is all about tailing a suspect until he can be put behind bars, after which the game starts over with another connected suspect.

At its core, TouchTone is a hacking mini-game similar to those in BioShock and Deus Ex. The puzzle itself is simple: a 6 x 7 grid on which laser beams of different colours must be directed to same-coloured targets by means of reflector panels.
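The grid mechanic can be illustrated with a short beam-tracing sketch. The 6 x 7 grid size comes from the article, but the reflector behaviour modelled here ('/' and '\' panels that turn a beam 90 degrees) is an assumption made for illustration; the real game's rules may differ.

```python
WIDTH, HEIGHT = 6, 7  # grid size mentioned in the article

# Direction vectors as (dx, dy).
RIGHT, LEFT, DOWN, UP = (1, 0), (-1, 0), (0, 1), (0, -1)

# How each (hypothetical) reflector panel redirects an incoming direction.
REFLECT = {
    "/":  {RIGHT: UP, LEFT: DOWN, DOWN: LEFT, UP: RIGHT},
    "\\": {RIGHT: DOWN, LEFT: UP, DOWN: RIGHT, UP: LEFT},
}

def trace_beam(start, direction, reflectors):
    """Follow a beam from `start` until it leaves the grid.

    `reflectors` maps (x, y) cells to '/' or '\\'. Returns the list of
    cells the beam passes through, so a caller can check whether it
    reaches a same-coloured target cell.
    """
    x, y = start
    path = []
    while 0 <= x < WIDTH and 0 <= y < HEIGHT:
        path.append((x, y))
        if (x, y) in reflectors:
            direction = REFLECT[reflectors[(x, y)]][direction]
        dx, dy = direction
        x, y = x + dx, y + dy
    return path
```

For example, a beam entering at (0, 0) heading right, with a '\' panel at (3, 0), travels along the top row and is then deflected down the fourth column until it exits the grid.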

Though the game may look deceptively simple, it can turn out to be slightly exhausting. Some people might not find it funny, since it reflects ordinary people's everyday lives; the game asks you to look at that premise with a sense of humour. It is meant to inspire conversation.

Wednesday, 8 April 2015

Livescribe Echo Smartpen


Livescribe Echo Smartpen = Record Audio/Take Notes/Play Back

The Livescribe Echo Smartpen enables the user to record audio while taking notes and play it back later. One can save as well as share interactive notes to a computer, iPhone or iPad through a micro-USB connection, which also recharges the pen.

The internal memory stores 400 or 800 hours of recorded audio, depending on the model, and an OLED display makes navigating the smartpen's apps easy. There are many mobile apps and desktop programs offering impressive note-taking capabilities, but nothing can replace handwritten notes.

Typing may be more efficient, but writing by hand helps a person learn better, retaining more of the information in the process, and stimulates the brain. Hence many students and professionals opt for analog notes over keyboards and searchable typed words.

This has paved the way for the Echo Smartpen by Livescribe, which takes traditional ink notes while making a digital copy of everything that is written. Moreover, it records everything heard or said as one writes. If one taps a word, the smartpen plays back whatever was recorded while those notes were being taken.

Installation of Several Applications

A Livescribe smartpen is the size and weight of a large pen, i.e. 5/8” x 6 1/8”, and has a removable ballpoint ink cartridge, a microphone for recording audio, a speaker for playback, a small OLED display, internal flash memory that captures handwritten notes, audio and drawings, and an infrared camera. The user has the option of recording audio alongside the handwritten text.

The recorded audio is kept indexed with the handwritten text, and tapping on a written word plays back the recorded audio from that part of the recording. The pen also allows installation of several applications, depending on available memory, and ships with several applications.

By tapping on the correct printed images, it can act as a calculator, or translate words: the translator software shipped with the pen comprises 21 words in a small collection of languages. As of September 2010, there do not appear to be any public plans to make a full version of this translator available.

Livescribe Desktop Software – Element of Digitized Notes 

The remarkable thing about the Echo Smartpen is a capability that goes well beyond paper: the Livescribe Desktop software manages the digitized notes. This software can be downloaded at no cost. When the Echo Smartpen is connected to the computer with the provided micro-USB cord, the Livescribe software quickly syncs the notes with the associated audio, after which the notes can be viewed, exported, played back and organized into new folders.

Notes that are associated with audio appear in green, and clicking on a word prompts the audio and a real-time playback of the note-taking. These interactive notes have been dubbed `Pencasts'. When the user wants to share notes with a colleague, they can be uploaded to a cloud account or saved on the computer, and Livescribe lets you export audio recordings, Pencasts, or the written notes through its desktop software. To export notes, Ctrl-click to select the notes you intend to export, then right-click the selection and choose `Send X Page to …'.

To Memorize Terms – Develop Audible Study Sheet

Then the destination needs to be chosen, such as Evernote, Google Docs or Computer. Lastly, one chooses to export the notes as a standard PDF, a Pencast (a special .PDF format), M4A audio, or .PNG.

For quick access to the standalone audio recordings, go to the `Audio' tab at the top and follow the same process for exporting audio files; one can then share long lectures with colleagues. Should the user need to memorize terms, they can create an audible study sheet with the Echo Smartpen: start by writing out the terms to be studied on a single sheet and, when done, record a definition for each term by tapping `Record', tapping the word, saying the definition, and tapping `Stop'.

The same process can be repeated for all the terms. This is faster than making flashcards, and one might also benefit more from hearing the definitions. Should the user be without a notebook and want to use the smartpen as an audio recorder, they can do so by long-pressing the power button; a few seconds later the pen beeps, indicating that recording has started.

To stop the recording, tap the power button once and the file is saved. The file can be played back later by tapping the menu button on any notebook page.

One can go to Main Menu > Paper Replay > Play Session and locate the recording. Any notes taken while playing it back will be synced with the recorded audio.