As we mentioned in our previous post about Amazon cracking down on sellers of unsafe USB cables, it’s now evident that many USB Type-C cables are just plain dangerous. With incredible utility does indeed come great responsibility, but manufacturers and resellers have prioritized maximizing data speeds over manageable temperatures and safe power transfer. Fortunately, USBCheck has a solution that puts cable-integrity testing directly into the user’s hands.
USBCheck measures the current draw when a compatible device is connected to a PC USB port and tells you whether that draw follows the appropriate USB specs and recommendations. While it can’t keep perfectly up to date with every device out there, it gives a precise reading that users can then check against the manufacturer’s power specs.
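To make the idea concrete, here is a minimal sketch of the kind of comparison such a tool performs: take a measured current draw and check it against the maximum the relevant USB power mode allows. The limits come from the USB specifications; the function name, mode labels, and sample readings are illustrative assumptions, not USBCheck’s actual interface.

```python
# Maximum current (in milliamps) permitted for common USB power modes,
# per the USB 2.0, USB 3.x, and USB Type-C specifications.
USB_CURRENT_LIMITS_MA = {
    "usb2": 500,          # USB 2.0: up to 500 mA
    "usb3": 900,          # USB 3.x: up to 900 mA
    "type_c_1_5a": 1500,  # USB Type-C at 1.5 A
    "type_c_3a": 3000,    # USB Type-C at 3.0 A
}

def check_current_draw(mode: str, measured_ma: float) -> str:
    """Compare a measured current draw against the spec limit for a mode."""
    limit = USB_CURRENT_LIMITS_MA[mode]
    if measured_ma <= limit:
        return f"OK: {measured_ma:.0f} mA is within the {limit} mA limit for {mode}"
    return f"WARNING: {measured_ma:.0f} mA exceeds the {limit} mA limit for {mode}"

# Example readings (hypothetical):
print(check_current_draw("usb2", 480))         # within spec
print(check_current_draw("type_c_1_5a", 2100)) # an out-of-spec cable/device
```

A real tester also has to account for what the cable and charger actually negotiated, which is exactly why a measured value you can check against the manufacturer’s specs is useful.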
Earlier this month, tech giant IBM announced a more efficient way to use phase-change memory. This new method is a breakthrough with the potential to transition electronic devices from standard RAM and flash to the much faster and more reliable phase-change memory, or PCM. Phase-change memory is a type of non-volatile storage that manipulates the behavior of chalcogenide glass, the same material used to store data on rewritable Blu-ray discs. Electric current applied to the PCM cells switches them between amorphous and crystalline structures, which is the equivalent of a 0 or a 1 for the purpose of memory storage.
In the past, PCM’s limited capacity and high cost have been barriers to its widespread use, but IBM researchers recently discovered a way to triple the storage per cell. By operating the cells at high temperatures and studying their reactions, the team was able to store 3 bits per cell instead of 1. Haris Pozidis, an IBM manager of non-volatile memory research, writes, “The jump is significant because at this density, the cost of PCM will be significantly less than DRAM and closer to flash.”
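The arithmetic behind the jump is worth spelling out. A single-level PCM cell distinguishes two resistance states (amorphous is high resistance, crystalline is low), which stores 1 bit; a cell that can reliably distinguish 2^n resistance bands stores n bits, so eight bands give 3 bits. The sketch below illustrates that relationship with a toy threshold decoder; the resistance values are made up for illustration and do not reflect IBM’s actual cell physics or read scheme.

```python
import math

def bits_per_cell(resistance_levels: int) -> int:
    """Number of bits a cell stores if it can distinguish this many levels."""
    return int(math.log2(resistance_levels))

def decode_cell(resistance_ohms: float, band_upper_bounds: list) -> int:
    """Map a resistance reading to a stored value via sorted band thresholds."""
    for value, upper in enumerate(band_upper_bounds):
        if resistance_ohms <= upper:
            return value
    return len(band_upper_bounds) - 1  # clamp readings above the top band

print(bits_per_cell(2))  # classic amorphous/crystalline PCM: 1 bit
print(bits_per_cell(8))  # eight distinguishable bands: 3 bits

# Toy 3-bit read: eight hypothetical resistance bands (ohms).
bands = [1e3, 3e3, 1e4, 3e4, 1e5, 3e5, 1e6, 3e6]
print(decode_cell(2.5e3, bands))  # falls in the second band -> value 1
```

The engineering challenge, and what the high-temperature characterization addresses, is keeping those eight bands distinguishable as the cells drift over time.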
As computer hardware becomes cheaper and more powerful, leaps such as NVIDIA’s new GeForce GTX 1080 are why it has never been a better time to be a PC gamer. The rise of virtual reality and the growing 4K trend make the power and efficiency of the GTX 1080 heavily desirable. Not only that, but with a $599 price tag and claims by NVIDIA of faster speeds than the $1000 Titan X, the GTX 1080 is an incredible value for the most powerful graphics card on the market.
Running at over 1600 MHz and with 8GB of Micron’s new GDDR5X RAM, it’s a massive clock-speed bump which will hopefully spur other manufacturers to do the same and mirror the performance increases of modern-day CPUs. By comparison, last year’s 980 Ti card came in at 1000 MHz with 6GB of standard GDDR5. Most of the GTX 1080’s performance upgrade can be credited to NVIDIA’s new Pascal architecture, making its consumer hardware debut in this card. The big benefit with Pascal is its 16nm FinFET transistor technology, which allows NVIDIA cards built on the new process to reach higher clock speeds while also being much more power efficient.
Understanding language can be difficult from human to human, much less for machines attempting to understand the intricacies of human speech and text. Google knows this as well as anyone, considering the countless queries made every hour that take users where they need to go, even in the face of mangled sentence structure and unfortunate spelling mistakes. Today, Google is open-sourcing something called SyntaxNet, and specifically a component for it, Parsey McParseface. With an endearing nod to Boaty McBoatface, the crowd-voted name for NERC’s research vessel earlier this year, Google is releasing the tools it uses to understand natural language, whether typed into a box or interpreted from spoken word.
SyntaxNet is the overall framework for parsing sentences, called a “syntactic parser.” Mr. McParseface is the English language plugin for SyntaxNet. Google claims the plugin can identify objects, subjects, verbs, and other grammatical building blocks of sentences as well as, or even better than, trained human linguists.
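To illustrate what “identifying grammatical building blocks” means in practice, here is a deliberately naive sketch that labels the subject, verb, and object of a simple three-part sentence. This is not SyntaxNet’s API or approach — real parsers like SyntaxNet use trained neural networks to handle ambiguity, word order, and clause structure — but it shows the shape of the output a syntactic parser produces.

```python
def naive_svo(sentence: str) -> dict:
    """Toy role labeler: assumes a simple 'subject verb object...' sentence.

    A real syntactic parser (like SyntaxNet) learns these roles from data
    rather than assuming word positions, which is why it can handle the
    messy sentences this heuristic would get wrong.
    """
    words = sentence.rstrip(".").split()
    if len(words) < 3:
        raise ValueError("expected at least a subject, verb, and object")
    return {
        "subject": words[0],
        "verb": words[1],
        "object": " ".join(words[2:]),
    }

print(naive_svo("Google released SyntaxNet."))
# {'subject': 'Google', 'verb': 'released', 'object': 'SyntaxNet'}
```

The gap between this heuristic and a trained parser is exactly what Google’s accuracy claim is about: handling sentences where position alone doesn’t reveal the grammatical role.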
As AI technology grows in popularity and complexity, the natural next step is to shrink the hardware the technology needs to function, and that’s exactly what a Silicon Valley chip designer has done.
Chip designer Movidius has launched a USB stick with a supercomputer onboard, essentially bringing the deep learning paradigm of artificial intelligence down to the spatial efficiency of a USB drive. Deep learning involves training a computer which, as we saw with Google’s AlphaGo, can become a true force to be reckoned with. Dubbed the “Fathom Neural Compute Stick,” the device has been designed to connect to existing systems and increase the performance of deep learning tasks by 20-30 times.
The chip maker is applying the technology to other endeavors as well, such as helping drones avoid obstacles and developing higher-sensitivity thermal imaging. The company has also signed a deal with Google, which may take their DeepMind project to new heights.
There have been many interesting hacks to increase the battery life of an iPhone since its release, but “light it on fire” has probably not been one of them. Lighting your phone on fire is still a terrible idea and should not be attempted to charge ANYTHING, but a device called the FlameStower USB Fire Charger lets you harness humanity’s oldest discovery to power one of its newest inventions.
The FlameStower is useful when camping, and especially useful for those unfortunate times stranded on a desert island. It works by feeding thermal energy into a generator that juices the phone: a blade protruding from the device is placed into the fire to get the generator going. From there, the attached phone begins to receive power, albeit at a crawl. It can charge small electronics, cameras, and flashlights, as long as the device receiving the charge is USB compatible.
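The physics behind a fire charger like this is the Seebeck effect: a thermoelectric module produces a voltage proportional to the temperature difference between its hot side (the blade in the flame) and its cold side, V = S·ΔT, and a matched load can draw at most P = V²/(4·R_internal). The back-of-the-envelope sketch below uses made-up module numbers, not FlameStower’s actual specs, to show why the charge comes “at a crawl.”

```python
def teg_power_watts(seebeck_v_per_k: float,
                    delta_t_k: float,
                    r_internal_ohms: float) -> float:
    """Max power a thermoelectric generator delivers into a matched load.

    Open-circuit voltage from the Seebeck effect: V = S * dT.
    Matched-load power: P = V**2 / (4 * R_internal).
    """
    v_open = seebeck_v_per_k * delta_t_k
    return v_open ** 2 / (4 * r_internal_ohms)

# Illustrative assumptions: a module producing 0.05 V/K, a 150 K
# hot/cold difference, and 5 ohm internal resistance.
watts = teg_power_watts(0.05, 150, 5.0)
print(round(watts, 2))  # ~2.81 W -- a fraction of a typical wall charger
```

Even a few watts is a slow charge compared to a wall adapter, which matches the “albeit at a crawl” experience, but it beats a dead phone on a desert island.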