Ignoring the absurdly terrible name, a patent filed by Sony Computer Entertainment Europe back on July 23rd 2012 has appeared online (cheers Joe), and it details plans for a new tracking controller that can interact via touch and track objects in fully three-dimensional space – at least within an allotted area.
The patent itself seems legitimate: the listed inventors have filed numerous patents on behalf of SCEE before, and the technology it describes isn’t completely implausible either.
So, what’s the piece of tech Sony are proposing? Something called the EyePad – a poor name choice there if you ask us.
The EyePad is pitched as a new peripheral that features both physical buttons and a touch pad in the centre – so far, so PlayStation Vita, right?
The EyePad will be larger and, judging by the picture on show, far more square in shape, and it won’t feature dual analogue sticks. It’s also described as “typically a panel, tablet or slab-like device.”
The interesting part is the set of embedded cameras around the touch pad’s edge. These work together to build up a three-dimensional picture that can be relayed back to the PlayStation, which then forms a three-dimensional virtual object and pinpoints the object’s exact position, size and shape. The device will also work in low-light areas thanks to a set of illuminating strips around the edge of the touch pad or camera bezels, which allow the PlayStation Eye to pick up the device a la PlayStation Move.
“The illuminated region(s) provide a similar functionality to that of the illuminated ball of the Playstation move, allowing the EyePad to be detected within an image captured by the EyeToy,” reads point 0031 of the patent. “The position of the illuminated region(s) within the captured image provides x, y co-ordinates for the EyePad, whilst the size of the illuminated region(s) within the captured image can be used to calculate a z co-ordinate for the Eyepad, based upon a reference size and distance relationship for images of the Eyepad captured by the EyeToy.
“In addition, the angle(s) of the illuminated region (s) within the captured images can provide additional information about the orientation of the EyePad.”
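The z calculation the patent describes comes down to the familiar inverse relationship between apparent size and distance. Here’s a rough sketch of that idea – the reference values and function names are illustrative, not taken from the patent:

```python
# Rough sketch of the tracking maths the patent describes: the console
# knows a reference size/distance pair for the EyePad's illuminated
# regions, so the region's apparent size in the captured frame yields
# depth (z), while its pixel position yields x and y.

REF_SIZE_PX = 120.0   # illustrative: region size at the reference distance
REF_DIST_CM = 100.0   # illustrative reference distance

def estimate_position(region_x_px, region_y_px, region_size_px):
    """Estimate (x, y, z) from an illuminated region in the EyeToy image."""
    # Apparent size is inversely proportional to distance, so a region
    # half the reference size sits at twice the reference distance.
    z_cm = REF_DIST_CM * (REF_SIZE_PX / region_size_px)
    return region_x_px, region_y_px, z_cm

print(estimate_position(320, 240, 60.0))  # half-size region -> 200 cm away
```

The angle of the illuminated strips would then layer orientation on top of this x, y, z estimate, much as the Move’s glowing ball does today.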
The patent then goes on to talk about the position of the EyePad’s cameras, what they do, and the device’s possible aspect ratios.
“The stereoscopic video cameras 1030A(L,R) and 1030B(L,R) are located at opposite corners of the touchpad 1020 (or of an equivalent surface area where a touchpad is not provided). In Figure 2 the left and right cameras of each stereoscopic camera is shown sharing a single protective bezel.
“In one embodiment of the present invention, the aspect ratio of the touchpad is 1:1 (i.e. square). In other embodiments, it may match the aspect ratio of a TV, such as 16:9. For simplicity of explanation and illustration, the description below assumes a square touchpad as a nonlimiting example.
“Cameras are positioned (with suitable lenses) a small distance apart (for example in the range 0.1 to 10 cm) substantially adjacent to two opposing corners of the touchpad,” continues the patent. “As noted above the cameras may share a protective bezel. In an alternative embodiment of the present invention, the cameras are located within the body of the EyePad (for example respectively in the side sections of the device 1040L and 1040R) and receive a view from the corners of the touchpad.
“In this case, optionally only one camera is used in each corner, but receives a split view from two prisms or fibre-optic arrangements and hence captures a stereoscopic pair of images simultaneously as two halves of a single image.”
By placing the cameras in these locations, the EyePad could then capture information from all sides to see exactly what’s going on and deliver the correct information to the PlayStation.
“The stereoscopic cameras can therefore provide stereo images for complementary sides of the common volume, from each of which so-called disparity maps or depth maps can be generated.”
“Consequently if an object (such as a user’s hand or an ornament) is placed within the common volume, complementary images and depth maps for both sides of the object can be obtained.
“Combining depth maps from two known positions on either side of an object provides a representative estimate of the 3D shape of the object, in an analogous fashion to two halves of a mould being combined to define the shape of a complete object.”
“In an embodiment of the present invention, the captured stereoscopic images are processed to generate two depth maps and two corresponding video images. In an embodiment of the present invention, the processing is performed by a processor 1100 on the EyePad, and the resulting depth maps and video images are transmitted to the PS3 via Bluetooth.”
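Those disparity and depth maps rely on the standard stereo relationship: depth is inversely proportional to the horizontal pixel shift of a feature between the left and right images. A minimal sketch, with illustrative focal length and baseline values (the patent only says the cameras sit 0.1 to 10 cm apart):

```python
# Sketch of the stereo geometry behind the patent's depth maps: for a
# calibrated camera pair, depth = focal_length * baseline / disparity.
# A large shift between the left and right views means a close object.

FOCAL_PX = 700.0     # illustrative focal length in pixels
BASELINE_CM = 5.0    # within the patent's 0.1-10 cm camera spacing

def disparity_to_depth(disparity_px):
    """Convert a disparity (pixel shift) to depth in centimetres."""
    if disparity_px <= 0:
        return float("inf")  # no shift: point is effectively at infinity
    return FOCAL_PX * BASELINE_CM / disparity_px

# A depth map is just this conversion applied per pixel to a disparity map.
print(disparity_to_depth(35.0))  # 100.0
```

With one such map computed from each corner of the pad, the two “halves of a mould” can be fused into a full 3D estimate of the object sitting in the common volume.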
That isn’t all the supposed EyePad can do: it also makes use of the DualShock 3’s Sixaxis technology to help the PlayStation determine exactly how the pad is oriented and what’s being done with it.
“In addition the EyePad can comprise similar motion sensors 1050 and Bluetooth or similar transmitters 1740 to those found within the SIXAXIS controller.
“Hence in a similar manner to the Move controller, the position and orientation of the EyePad can be determined to a high degree of accuracy using a combination of cues from the captured image and also data regarding the relative translation and rotation detected by the EyePad and transmitted from the EyePad to the PS3.”
SCEE seem to see the EyePad as an “augmented reality marker,” suggesting that the company plans to move closer to creating virtual worlds for us to interact with directly – think EyePet, Wonderbook and the like – an area SCEE have already seen potential in.
“For example, the EyePad may be used as a grooming table for an EyePet ®; in the EyePet games,” reads point 0053, “video captured by the EyeToy video camera is displayed on a television in conjunction with a virtual pet character; movements of the user and the use of AR markers enables the user to interact with the pet.
“Hence for example, the user moves the EyePad downwards so that its absolute position in the captured EyeToy image is below a threshold level, prompting the EyePet to jump on to the EyePad, with the positioning of the EyePet being consistent with both the position and orientation of the EyePad. The user may then carry the EyePad and the superposed EyePet around their room, with the EyePet being scaled and rotated accordingly to appear to remain sat upon the EyePad.”
There are also possibilities to scan real-world objects for use in-game, or even to have your own face accurately captured and replicated on an in-game avatar.
“The user can place an object on the EyePad, and see it copied into the virtual world of the game,” reads point 0060. “In a similar manner, the user can put their face within the common volume in order to import their own face onto an in-game character or other avatar.”
As you can see, there’s immense potential for this proposed EyePad. The document goes on to detail a few more uses, the technology behind how it exchanges information and renders images, and the utilisation of the touch pad and built-in speaker. Notably, it also suggests the touch pad could be replaced with a touch screen to enable a second-screen experience.
Could we expect to hear something about it at Sony’s event on February 20th?
Naturally we reached out to SCEE to ask about the patent; we’re still waiting on a response, so we’ll update you when we hear back.
UPDATE: Sony declined to comment, saying “we never comment on patents.” So it looks like we’re no closer to knowing what it could be.