LG patents flexible capacitive stylus that wraps around your arm, behaves like a smartwatch
Feb06


A new patent application from LG has surfaced today courtesy of the USPTO. In it, the Korean company describes a very interesting device: essentially a flexible capacitive stylus that can be worn on your arm like a watch, and that can behave like a smartwatch too. This could be a remarkable evolution of LG's Life Band Touch activity tracker wristband, which should become available in the coming weeks. Or it may be a different product altogether, one sold in conjunction with the Life Band Touch. And, as always with patent applications, there's also the possibility that no actual product will ever result from this. It could have been just an interesting project for an R&D division, and nothing more. Let's hope not, though. This has the potential to be quite unique, marrying the still rarely used capacitive stylus with some of the functions of a smartwatch, and possibly even an activity tracker.

It's essentially a flexible wristband with a capacitive stylus tip at one end. It wraps around your arm and you can wear it like a watch. In one embodiment described in the patent application, it also has a touchscreen which can be used to relay information from a connected mobile device (think smartphone or tablet), or even to control certain aspects of said paired device. This sounds familiar, in a way: LG's Life Band Touch can remote control a smartphone-based music player, after all.

The device basically has two modes of existence: as a stylus, when it's straight, and as a wristband of sorts, after it's been 'deformed'. It stays in either form until you change it into the other. It even knows whether it's straight or bent, and can communicate its state to a connected mobile device. Furthermore, a number of different sensors can be fitted to this stylus, allowing it to track your activity in a very similar way to the Life Band Touch wristband and other activity trackers.
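The dual-mode behavior described above, a band that knows whether it is straight or bent and reports that state to a paired device, can be sketched in a few lines. This is purely illustrative; the class and method names are my own, not anything from LG's filing:

```python
# Minimal sketch of the patent's dual-mode logic: the wearable reports
# whether it is straight (stylus mode) or bent (wristband mode) to a
# paired device. All names here are hypothetical.

from enum import Enum

class Shape(Enum):
    STRAIGHT = "straight"   # acts as a capacitive stylus
    BENT = "bent"           # wrapped around the wrist, smartwatch mode

class FlexibleStylus:
    def __init__(self):
        self.shape = Shape.STRAIGHT
        self.listeners = []   # paired devices notified of state changes

    def pair(self, callback):
        self.listeners.append(callback)

    def deform(self, new_shape):
        """Called by the (hypothetical) flex sensor when the band changes shape."""
        if new_shape is not self.shape:
            self.shape = new_shape
            for notify in self.listeners:
                notify(self.shape)

    def mode(self):
        return "stylus" if self.shape is Shape.STRAIGHT else "wristband"
```

The point of the sketch is that the shape change itself is the event: the paired phone is only notified when the band actually transitions between forms.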
The stylus pen tip could be made detachable, allowing for easy replacement if needed. LG thinks capacitive styluses aren't more common right now because they're easy to lose, and because storing them inside mobile devices makes those devices bulkier than they need to be. This invention is obviously the company's solution to both of these issues, and a pretty innovative one. Whether or not it will catch on is another matter entirely,...

Apple is working on Optical Image Stabilization and improved Autofocus for iPhone camera
Jan10

Apple never cared much about megapixel count in iPhone cameras. They started with 2 megapixels in the original iPhone, when Nokia was already shipping smartphones with 5 megapixel modules. It took Apple three years to reach the then-standard 5 megapixels in the iPhone 4. And its latest iPhones are still stuck with 8 megapixel sensors, when every other flagship smartphone comes with 13 megapixels or more. Despite that, every iPhone review comes to the conclusion that Apple's handsets are among the best camera phones out there, handily beating any Android rival, with the only viable competition coming from Nokia. Which proves that Apple was right to focus on better lenses, bigger pixels and sensors, and advanced image processing algorithms, not pixel count.

The only disappointing thing about the 8 megapixel iPhone 5S camera was the lack of optical image stabilization (OIS), which allows for much better low light performance and better videos. But the lack of OIS in the iPhone 5S doesn't mean that Apple doesn't care about it. In fact, we can now confirm that Apple is indeed working on optical image stabilization and an improved autofocus system for iPhone cameras.

Yesterday an Apple patent application called "VCM OIS actuator module" was published by the USPTO. It describes how Apple plans to go about adding OIS and improved autofocus in future iPhones. Apple will be using voice coil motor actuators to move the camera lens in various directions around the optical axis to provide both better autofocus and OIS:

"Actuator module may have integrated therein a mechanism to provide the AF function and a mechanism to provide the OIS function. The AF mechanism is configured to both move the lens along the optical axis and actively tilt the lens. The lens tilt may be used to compensate for parasitic lens movements due to, for example, tilting of the device within which actuator module is implemented.
The OIS mechanism is configured to move (e.g., shift) the lens in directions orthogonal to the optical axis to correct for handshake motions in the center of the image. By shifting, as opposed to tilting the entire camera (e.g., the lens and image sensor together as a rigid body), the associated image sensor substrate can remain stationary, substantially simplifying both camera manufacture, size and packaging in the mobile electronic device."

It is unclear how far along Apple's OIS/AF system is. But the patent filing dates show that they have been working on it since at least early 2012, and should be pretty far along towards a viable commercial product. Yesterday's rumors about an iPhone 6 camera with OIS point in the same direction, as well. Source:...
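The shift-type OIS the patent describes has a simple first-order geometry behind it: a handshake tilt of the device displaces the image by roughly the focal length times the tangent of the tilt angle, so the voice coil motor shifts the lens by the opposite amount. A back-of-the-envelope sketch, with illustrative numbers that are not from Apple's filing:

```python
# Back-of-the-envelope shift-type OIS, assuming a simple thin-lens model:
# a handshake tilt of theta displaces the image by ~f * tan(theta), so the
# VCM shifts the lens by the opposite amount to keep the image steady.

import math

def ois_lens_shift_mm(focal_length_mm, tilt_deg):
    """Lens shift (mm) needed to cancel image motion from a small device tilt."""
    theta = math.radians(tilt_deg)
    return -focal_length_mm * math.tan(theta)

# Example: a 4.1 mm lens (a typical smartphone focal length) and a
# 0.5 degree handshake tilt call for a shift of a few hundredths of a mm.
shift = ois_lens_shift_mm(4.1, 0.5)
```

The tiny magnitudes involved are exactly why voice coil motors, which can make very fine, fast movements, are the actuator of choice here.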

Apple’s plans for Touch ID: trackpad for 5”+ iPhones, all display as fingerprint scanner & more in a patent app
Nov25

When the iPhone 5S launched, a new word entered the discussion pretty soon. Many early users and reviewers agreed that the new Apple smartphone was "futureproof". They were talking about the new 64-bit A7 CPU powering the iPhone 5S, and the M7 coprocessor that continuously tracks motion data. The value of both of those chips is pretty limited today, but they will enable a whole class of new apps and user experiences in the future.

But the A7 and M7 are not the only future proof iPhone 5S parts. Its new Touch ID fingerprint sensor is just a biometric ID for now. It makes locking and unlocking your smartphone, and shopping on iTunes, a tad easier. But Apple has much bigger plans for Touch ID.

The near term Apple Touch ID designs include transforming the fingerprint sensor under the Home button into a trackpad, adding a new way to navigate and control the iPhone UI. It should come in very handy with the launch of iPhones with 5"+ displays next year. If you can control them by sliding your thumb around the Home button just as well as by swiping your finger around an iPhone 4S display, your big iPhone 6 might be just as easy to navigate with a single hand. And there goes one of Apple's main stated reasons for not building larger iPhones until now: one hand usability.

That's the near term plan. But Apple has even bigger designs for Touch ID in a more distant future. I may be oversimplifying a bit here, but the Touch ID sensor in the iPhone 5S is the same capacitive touch sensor Apple uses for Multi-touch, only with much higher (500 dpi) resolution. So enabling Touch ID across the whole iPhone display is a matter of scaling up the technology Apple already made work on a Home button. An iPhone or iPad with a Touch ID display will be able to recognize not just multi-touch gestures, but also which finger from which hand you use to touch the screen. Which opens up the possibility to significantly upgrade the Multi-touch UI with new gestures and user experiences.
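The Home-button-as-trackpad idea boils down to scaling: small thumb movements across the high-resolution sensor get amplified into much larger scroll or cursor deltas on screen. A minimal sketch, where the gain value and class names are my own assumptions, not anything from the patent app:

```python
# Hypothetical sketch of a Home-button trackpad: raw 500 dpi sensor
# coordinates are converted into amplified on-screen deltas.

class HomeButtonTrackpad:
    def __init__(self, sensor_dpi=500, screen_ppi=326, gain=8.0):
        # gain: how strongly tiny sensor travel is amplified on screen
        self.px_per_dot = (screen_ppi / sensor_dpi) * gain
        self.last = None

    def touch(self, x, y):
        """Feed raw sensor coordinates; returns (dx, dy) in screen pixels."""
        if self.last is None:
            self.last = (x, y)
            return (0.0, 0.0)
        dx = (x - self.last[0]) * self.px_per_dot
        dy = (y - self.last[1]) * self.px_per_dot
        self.last = (x, y)
        return (dx, dy)

    def lift(self):
        self.last = None   # next touch starts a new gesture
```

Because the sensor is so dense, even a thumb swipe a few millimeters long yields enough raw resolution to drive smooth full-screen scrolling, which is what makes one-handed navigation of a big phone plausible.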
Apple's plans for the Touch ID sensor are described in detail in a patent application called "Device, Method, and Graphical User Interface for Manipulating User Interfaces Based on Fingerprint Sensor Inputs", which showed up in the World Intellectual Property Organization database last week. Here are some of the ways Apple intends to use Touch ID, as described in the patent app.

iPhone Home button as an advanced trackpad

The key to your Home button's ability to act as a trackpad is the extreme sensitivity of the fingerprint sensor. At 500 dpi, the Touch ID sensor is able to read not just...

Google Translate app for Android tablets may offer conversations via split screen, two keyboards
Nov21

Yesterday Google updated their Translate app for Android, adding a simpler, more intuitive interface and handwriting recognition for more languages. The app update isn't available in the Play Store here yet, so I couldn't test it myself. But from the looks of it, the new UI will be a huge improvement over the old one, and should make conversing in a language you don't know much easier.

And Google isn't done with Translate improvements yet. Recently a patent application showed up in the USPTO database, showing what might be in store for the Translate app on Android tablets in the near future. If Google decides to make the UI described in this patent application a reality, the Translate app will split the Android tablet screen into two parts, one facing you, the other facing your partner, and will allow you to converse by typing phrases on both sides. I know that the voice translation the current version of Translate offers looks way more natural. But voice recognition, even by Google, still has a ways to go before it becomes really trustworthy. This written split screen/double keyboard approach just looks so much more...

Samsung smartphone with flexible wraparound display, that actually has a good reason to be. Launch next year
Nov14

The first smartphones with flexible displays, the Samsung Galaxy Round and LG G Flex, are pretty pointless devices. Samsung and LG made them for bragging rights, and simply because they could, not because there's something those phones can do better than handsets with traditional, flat screens. But that does not mean they won't find ways to do useful stuff better with flexible OLEDs in the future. They certainly will. In fact, Samsung already demoed one such approach almost a year ago, when they showed us a phone prototype with the screen bending over the side, creating an additional control/information bar.

Today Bloomberg reports that Samsung plans to launch this phone sometime next year. And, by a strange coincidence, a patent application called "Method and apparatus for operating functions of portable terminal having bended display" was published by the USPTO today. In it, Samsung delves deeper into the cool things a phone with such a wraparound display will allow us to do. Like:

Slide to lock/unlock, or battery/charge indicators on a side-screen that could be always on.
Arranging and navigating through your photo gallery via folders or dates, with a combination of touch and tilt gestures.
A quick navigation menu for your address book, or chapter navigation for an e-book.
A clipboard/storage area for items you find interesting and want to share.
Displaying the size of attachments of your inbox messages.

These things look much more interesting and useful than anything the Galaxy Round and G Flex can do today. Of course, to make it work on a real device, Samsung has a lot of work ahead of them. But, hey, Samsung execs recently bragged that half of their R&D staff now work in software development. That's about 30,000 people programming something 8 hours a day, 40 hours a week. Maybe with a device like this, they could finally create a good reason for Touchwiz to be something more than an annoying Samsung ego...

Could your iPhone send GPS data even when it’s off? Yep. Apple started toying with the idea last year
Nov13

Can your iPhone transmit your location data even when it is switched off? This is not an idle question. The Washington Post recently claimed that the NSA has been collecting location data from powered down phones for almost a decade now. And when the British watchdog Privacy International asked cellphone makers whether such a thing is possible, Nokia and Samsung kind of denied it, many others, including Apple, didn't reply, and Ericsson said that it is possible if you infect the phone with the right malware.

So what about your iPhone? Well, it does not do anything like that today. But last year Apple started toying with the idea of enabling its smartphones to send location data even when they are powered down. The technology, and why Apple is thinking about it, is described in their patent application called "Apparatus and method for determining a wireless devices location after shutdown". Don't worry, Tim Cook didn't sell out to the NSA. There actually is a very good reason for your powered down iPhone to sometimes wake up and check in. And the reason is possible loss or theft.

Apple already provides an iPhone locating service today. But it only works when your device is powered on. What if the thief switches it off as soon as he gets away? You are out of luck. So Apple had an idea. They designed a system that periodically and quietly powers up your switched off iPhone and sends its location data to servers in Cupertino. It does not always do that, and has to be actively enabled and configured by you. Also, you can disable the check-in system simply by entering a security code when you are switching off your device. If the code is correct, your iPhone stays off no matter what. But if it's the wrong one, or someone doesn't know they have to enter the code, the periodic and silent wake-ups and check-ins start. Furthermore, your iPhone can quietly snap and send in a picture or two of its surroundings as well. Sounds like a pretty useful feature to me.
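The power-off decision described above is a small piece of logic: correct code means truly off, wrong or missing code means the silent check-ins get armed. A minimal sketch, with hypothetical names (the filing does not specify an API):

```python
# Sketch of the shutdown logic from the patent application: the correct
# security code at power-off keeps the phone fully off, while a wrong or
# missing code arms silent periodic check-ins. Names are hypothetical.

class ShutdownGuard:
    def __init__(self, secret_code, enabled=True):
        self.secret_code = secret_code
        self.enabled = enabled        # the owner must opt in first

    def power_off(self, entered_code=None):
        """Returns the resulting state: 'off' or 'off_with_checkins'."""
        if not self.enabled:
            return "off"              # feature never enabled: normal shutdown
        if entered_code == self.secret_code:
            return "off"              # correct code: stays off no matter what
        return "off_with_checkins"    # wrong/no code: silent wake-ups armed
```

Note the asymmetry that makes this anti-theft rather than surveillance: a thief who doesn't know a code exists gets the tracking behavior by default, while the owner can always opt out at the moment of shutdown.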
Unfortunately, due to all this Snowden hoopla, it’ll probably remain in some unreleased version of iOS for...

Motorola patents skin tattoo throat mic for your smartphone. That can act as a… lie detector
Nov07

Remember the haptic tattoos Nokia has been playing with to improve your smartphone experience? Well, the Finns are not the only ones exploring tattoos for mobile. Motorola is looking into its own skin tattoo stickers that could interact with your Moto X or the latest Droid. Only instead of vibrations that inform you about the messages you receive on a phone, Motorola wants to use its stickers as throat mics, to capture your voice and send it to the phone. The reason for doing this? Motorola says that such a tattoo mic will be better able to hear what you are trying to say in noisy environments, filter out acoustic noise, and transmit a clear signal to your phone.

Motorola's tattoo sticker will come equipped with its own transceiver and antenna, microphone, signal processor, and even a small screen, for some reason. It's not like you'll be able to see what's on that display when it's stuck to your throat. The Moto sticker tattoo will also have a power supply unit "configured to receive energizing signals from a personal area network" associated with the smartphone you carry.

And that's not all. Motorola says that this tattoo mic might include:

"… galvanic skin response detector to detect skin resistance of a user. It is contemplated that a user that may be nervous or engaging in speaking falsehoods may exhibit different galvanic skin response than a more confident, truth telling individual."

Why you would want your own smartphone accessory to act as a lie detector on yourself, I have no idea. Though that screen I was talking about before? Now that would come in handy for this particular feature, flashing red for all the world to see that you are lying to your girlfriend on the other end of the line.

Nokia's haptic tattoos, while pretty far out, at least made some sense. A tattoo sticker for a throat mic? That's actually cool. But all that additional stuff Motorola R&D types are talking about in their patent application? Now that is just plain weird. Source:...
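For what it's worth, the galvanic skin response idea quoted above amounts to watching skin conductance for sharp deviations from the wearer's resting baseline. A toy sketch, where the threshold and indeed the whole approach are illustrative (real GSR-based stress detection is far less clear-cut than the patent's wording suggests):

```python
# Toy galvanic-skin-response check: flag readings that deviate sharply
# from the wearer's resting skin-conductance baseline. Threshold is an
# illustrative assumption, not a figure from Motorola's filing.

def gsr_flags(samples_microsiemens, baseline, threshold=0.3):
    """Flag samples deviating from baseline by more than `threshold` (fractional)."""
    return [abs(s - baseline) / baseline > threshold
            for s in samples_microsiemens]
```

Even granting the premise, such a detector could only say "the wearer's skin response changed", not why, which is one more reason the lie-detector framing is odd.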

Google Project Loon: 10 Mbps for users, 50 Gbps ultra-bright LED links between backbone super-nodes 100 miles apart
Aug14

So far Google has been describing their Project Loon balloon internet initiative in very general terms. Yes, we know that the idea is to provide internet access via a network of balloons floating in the stratosphere, 20 kilometers up, using the rather stable atmospheric winds at different altitudes to keep balloons on station by moving them up or down. But that's about all Google has told us about Project Loon. How exactly do those balloons keep their station and move up or down? How big can the Loon network be? How do the balloons communicate with each other, with ground stations, and with internet access points on the ground? What kind of speeds can we expect? We had no idea. Until now.

Recently, several patent applications related to Project Loon became public in the USPTO database, with a lot of details about Google's balloon Internet. Including expected 10 Mbps speeds for users on the ground, and a 50 Gbps backbone network of Super Balloons 100 miles apart, covering huge areas via ultra-bright LED free-air optical links.

The balloon network with a 50 Gbps super-node backbone and 10 Mbps downlinks

In the patent apps Google talks about various kinds of possible network configurations. But the most interesting and ambitious one is this. The high-altitude balloon network consists of two types of balloons:

A backbone of super-nodes that use ultra-bright LEDs to talk to each other via free air optical links, over distances of up to 100 miles. According to Google, such a network of super-nodes can achieve data transmission rates of 10 to 50 Gigabits per second.
A number of sub-node balloons that connect to the super-nodes and to access points on the ground, providing ordinary users with 10 Mbps wireless Internet connections.

Super-nodes may also talk to each other using lasers, but that may be problematic due to various regulations regarding laser comms. Super and sub-node balloons form balloon clusters (BC) over certain defined geographic areas.
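The two headline figures, 10 Mbps per user and a 50 Gbps backbone, let us do a rough capacity estimate for such a cluster. A sketch, where the users-per-sub-node figure is my own illustrative assumption, not a number from the patents:

```python
# Rough cluster sizing from the filings' figures: each user gets 10 Mbps,
# and a super-node backbone link tops out at 50 Gbps. Users-per-sub-node
# is an illustrative assumption.

import math

USER_RATE_MBPS = 10
BACKBONE_GBPS = 50

def cluster_size(concurrent_users, users_per_subnode=200):
    """Estimate (sub-nodes, super-nodes) needed to serve an area."""
    subnodes = math.ceil(concurrent_users / users_per_subnode)
    total_demand_gbps = concurrent_users * USER_RATE_MBPS / 1000
    supernodes = max(1, math.ceil(total_demand_gbps / BACKBONE_GBPS))
    return subnodes, supernodes
```

Under these assumptions, even 10,000 simultaneous users saturating their 10 Mbps links generate only 100 Gbps of aggregate demand, which two backbone links can carry, which is why a handful of super-nodes can anchor a large cluster.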
Sub-nodes can move between nearby clusters, and their density may be adjusted according to data throughput requirements. E.g. a cluster with a lot of sub-nodes may be needed above a city, while rural areas may make do with one super-node and a few sub-nodes. The network is also able to temporarily create and move balloon clusters as needed, for example for an event like a rock festival, or a disaster relief effort. There is also one more type of balloon, which stays more or less above the project's ground stations and connects the network to them via an optical or high throughput radio link.

The balloons

Balloons move around and keep station above a certain area by riding the relatively constant winds that blow in different directions at different altitudes in the stratosphere. They...

Apple is working on Smart Stay like gaze detection feature for iPhone. May come in iOS 7. Or later
May30

Samsung has one cool feature on last year's flagship, the Galaxy S3. It's called Smart Stay, and it determines whether you are looking at the phone screen, then prevents the display from auto-dimming until you move your eyes off it. When it works, it is a pretty useful thing when you take your time reading some web article on the huge SGS3 display, and don't want to constantly slide your finger on it just to keep the display from turning off.

Well, Apple has been thinking about adding such a gaze detection feature to the iPhone for several years now. And even has a patent application, called "Electronic devices with gaze detection capabilities", to prove it. Apple's gaze detection tech works by utilizing the accelerometer in combination with the front facing camera. The accelerometer helps figure out when you are holding your iPhone still, like when you are reading something on it. Then the face detection software chips in and figures out whether you are looking at the phone display or not. If you are, the switch preventing the screen from dimming turns on. When you move your eyes away, or move the phone away, your iPhone turns the normal energy saving mode back on, and shuts off the display after a few seconds.

The patent application above was filed on January 25th, 2013. Which goes to show that Apple may be actively developing a gaze detection feature to include in iOS 7. Or maybe not. This patent app is just a new version of one Apple filed way back in 2008, and we haven't seen a trace of gaze detection in any iPhone that came out between then and now. I don't know why Apple didn't include a Smart Stay like thingie in older iPhones. Maybe they saw no point for it on small 3.5" screens. But the iPhone 5 screen is already bigger, so you can spend longer reading before you have to scroll. Maybe it's time for iOS 7 to finally get gaze detection this...
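The two-stage logic described in the application, accelerometer first, face detection second, can be sketched in a few lines. Threshold and names are hypothetical, my own illustration of the described flow:

```python
# Sketch of the patent's two-stage gaze logic: only consult (battery-
# hungry) face detection when the accelerometer says the phone is held
# still; then keep the screen awake while a face is looking at it.

def screen_should_stay_on(accel_magnitude, gaze_detected,
                          still_threshold=0.05):
    """accel_magnitude: deviation from rest, in g. Returns True when
    auto-dimming should be inhibited."""
    holding_still = accel_magnitude < still_threshold
    if not holding_still:
        return False          # phone moving: fall back to normal auto-dim
    return gaze_detected      # still + face looking: inhibit dimming
```

The accelerometer gate is the clever bit: running the front camera and face detection continuously would eat the battery, so they only run when the motion data suggests you might actually be reading.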

Google plans to include haptic/tactile feedback engine into Android to… make their ads more effective
May23

Looking at it from a business model point of view, Google is an advertising company. All their apps and services, and almost all the technologies they are working on, are geared for one purpose only: get as many pageviews as possible to display ads on, and gather as much information about you as possible, to show the most actionable ads. Today the USPTO published a patent application called "Providing information through tactile feedback" that makes Google's obsessive advertising priorities even more clear.

The patent app deals with the issue of tactile/haptic feedback on touchscreen devices. The lack of such tactile feedback from our full touch smartphones and tablets is probably one of the most annoying things about them. Apple, Nokia, Samsung and other companies in the smartphone biz have been trying to solve the touch device tactile feedback problem for years. But, except for the rather primitive haptic engines that vibrate most of the device on a touch event, none of them has anything interesting to show us outside an R&D lab. They will figure something out eventually, though. And this brings us to Google, and their latest patent app.

In it, Google does not go deep into the technologies of how to make tactile/haptic feedback work. They do an overview of what might be possible, and how it might be done, but it is very broad and not really helpful. For all intents and purposes, Google just says that eventually the solution will be there: devices with very granular tactile feedback, along the full surface of a touchscreen, will come along. And then Google will be ready with their own "Tactile Interface Engine" inside Android, to display all those ads on your Android smartphone with tactile feedback, to make them better and more effective. The ways to do so, described in the patent app, include:

Separating an ad displayed on a webpage with a border having a distinct tactile feel, so you notice when your finger is approaching the ad.
Displaying multiple ads in separate tabs, and giving each ad a different texture/feeling of roughness. The ad considered most relevant will give the roughest feedback, the next one a different and lighter sensation, and so on. If you decide to drag one of the background ads to a new place, the display will provide you with the physical feeling of dragging something, to keep you more focused. And those different textures of roughness? Well, they can be adjusted to the ad content, too: "if the content displayed in the first region 208 corresponds to an ad for a beach resort, the haptic feedback provided...
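The relevance-to-roughness ranking described above is easy to sketch: the top-ranked ad gets the roughest texture and lower ranks get progressively lighter ones. The linear falloff and the 0..1 roughness scale here are my own illustrative choices, not anything specified in the patent app:

```python
# Toy mapping from ad relevance rank to haptic roughness: rank 0 is the
# most relevant ad and gets the roughest texture; roughness falls off
# linearly for lower-ranked ads. Scale and falloff are illustrative.

def roughness_levels(num_ads, max_roughness=1.0):
    """Roughness value per ad, ordered by relevance (rank 0 = roughest)."""
    if num_ads < 1:
        return []
    if num_ads == 1:
        return [max_roughness]
    step = max_roughness / num_ads
    return [max_roughness - rank * step for rank in range(num_ads)]
```

Whatever one thinks of haptic ads, the ranking scheme itself is just an ordinal-to-intensity mapping, the same pattern you'd use for, say, notification priority vibration strength.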
