Sunday, June 27, 2010

Bluetooth 4.0

Bluetooth low energy and its predecessors (think Wibree) have been in the pipe for ages now, but we might actually see this tech take off en masse for the first time now that the Bluetooth SIG has officially added it into a release: 4.0. While Bluetooth 3.0 was all about high energy with the introduction of WiFi transfer, 4.0 takes things down a notch by certifying single-mode low energy devices in addition to dual-mode devices that incorporate both the low energy side of the spec plus either 2.1+EDR or 3.0. In a nutshell, the technology should bring a number of new categories and form factors of wireless devices into the fold since 1Mbps Bluetooth low energy can operate on coin cells -- the kinds you find in wristwatches, calculators, and remote controls -- and the SIG's pulling no punches by saying that "with today's announcement the race is on for product designers to be the first to market." Nokia pioneered Wibree, so you can bet they'll be among the frontrunners -- bring it, guys.

"Bluetooth v4.0 throws open the doors to a host of new markets for Bluetooth manufacturers and products such as watches, remote controls, and a variety of medical and in-home sensors," said Bluetooth SIG executive director Michael Foley. "Many of these products run on button-cell batteries that must last for years versus hours and will also benefit from the longer range enabled by this new version of the Bluetooth specification. "
When talking about "longer range" Foley is referencing the fact Bluetooth 4.0 will also be capable of integrating with WiFi signals just like v3.0 + High Speed. SIG now refers to this functionality as 'Classic Bluetooth', far less of a mouthful.
So Bluetooth continues to evolve and defy the critics who claimed the standard would die off years ago. On the other hand, whether it will be more widely adopted than Bluetooth 3.0 + High Speed remains to be seen. Consumer devices featuring Bluetooth 4.0 should launch between late 2010 and early 2011, so long as companies care to implement it.
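
To see why the coin-cell claim is plausible, here is a rough back-of-the-envelope estimate. The capacity and average-current figures are assumptions for illustration (a CR2032 cell is typically rated around 220 mAh, and a low energy sensor that only wakes briefly to advertise or report a reading might average around ten microamps), not numbers taken from the specification itself.

// Rough estimate of how long a coin cell could power a Bluetooth low energy sensor.
// The figures below are illustrative assumptions, not values from the 4.0 spec.
public class CoinCellEstimate {
    public static void main(String[] args) {
        double cellCapacityMilliampHours = 220.0; // typical CR2032 rating (assumed)
        double averageCurrentMilliamps = 0.01;    // ~10 microamps average draw (assumed)

        double hours = cellCapacityMilliampHours / averageCurrentMilliamps;
        double years = hours / (24.0 * 365.0);

        System.out.printf("Estimated lifetime: %.0f hours (~%.1f years)%n", hours, years);
    }
}

With those assumed numbers the cell lasts roughly two and a half years, which is the ballpark the SIG is talking about; a device that keeps the radio on continuously would drain the same cell in days.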

Intel Core i7

Intel Core i7 is an Intel brand name for several families of desktop and laptop 64-bit x86-64 processors using the Nehalem microarchitecture that are marketed for the business and high-end consumer markets. The "Core i7" brand is intended to differentiate these processors from Core i5 processors intended for the mainstream consumer market and Core i3 processors intended for the entry-level consumer market.
"Core i7" is a successor to the Intel Core 2 brand. The Core i7 identifier was first applied to the initial family of processors codenamed Bloomfield introduced in 2008. In 2009 the name was applied to Lynnfield and Clarksfield models. Prior to 2010, all models were quad-core processors. In 2010, the name was applied to dual-core Arrandale models, and the Gulftown Core i7-980X Extreme processor which has six hyperthreaded cores.
Intel representatives state that the moniker Core i7 is meant to help consumers decide which processor to purchase as the newer Nehalem-based products are released in the future. The name continues the use of the Intel Core brand. Core i7, first assembled in Costa Rica, was officially launched on November 17, 2008 and is manufactured in Arizona, New Mexico and Oregon, though the Oregon (PTD, Fab D1D) plant has already moved to the next generation 32 nm process.

Intel Atom Processor


Intel Atom is a direct successor of the Intel A100 and A110 low-power microprocessors (code-named Stealey), which were built on a 90 nm process, had 512 KB of L2 cache, and ran at 600 MHz or 800 MHz with a 3 W TDP (thermal design power). Prior to the Silverthorne announcement, outside sources had speculated that Atom would compete with AMD's Geode system-on-a-chip processors, used by the One Laptop per Child project, and other cost- and power-sensitive applications for x86 processors. However, Intel revealed on October 15, 2007 that it was developing another new mobile processor, codenamed Diamondville, for OLPC-type devices.
"Atom" was the name under which Silverthorne would be sold, while the supporting chipset formerly code-named Menlow was called Centrino Atom. Intel's initial Atom press release only briefly discussed "Diamondville" and implied that it too would be named "Atom", strengthening speculation that Diamondville is simply a lower-cost, higher-yielding version of Silverthorne with slightly higher TDPs at slightly lower clock speeds.
At the Spring 2008 Intel Developer Forum (IDF) in Shanghai, Intel officially announced that Silverthorne and Diamondville are based on the same microarchitecture. Silverthorne would be called the Atom Z series and Diamondville the Atom N series. The more expensive, lower-power Silverthorne parts would be used in Intel Mobile Internet Devices (MIDs), whereas Diamondville would be used in low-cost desktops and notebooks. Several Mini-ITX motherboard samples were also revealed. Intel and Lenovo jointly announced an Atom-powered MID called the IdeaPad U8. The IdeaPad U8 weighs 280 g and has a 4.8 in (12 cm) touchscreen, providing better portability than a netbook and easier Internet viewing than a mobile phone or PDA.
In April 2008, a MID development kit was announced by Sophia Systems, and the first board, called CoreExpress-ECO, was revealed by the German company LiPPERT Embedded Computers GmbH. Intel also offers Atom-based motherboards.
Intel Atom is the brand name for a line of ultra-low-voltage x86 and x86-64 CPUs (or microprocessors) from Intel, designed in 45 nm CMOS and used mainly in netbooks, nettops, and Mobile Internet Devices (MIDs). On December 21, 2009, Intel announced the next generation of Atom processors, including the N450, with total kit power consumption down 40%.

Nvidia PhysX


PhysX is a proprietary real-time physics engine middleware SDK originally developed by NovodeX, an ETH Zurich spin-off that Ageia acquired in 2004; Ageia itself was acquired by Nvidia in February 2008. The term PhysX can also refer to the PPU add-in card designed by Ageia to accelerate PhysX-enabled video games. Video games supporting hardware acceleration by PhysX can be accelerated by either a PhysX PPU or a CUDA-enabled GeForce GPU (one with at least 32 CUDA cores), thus offloading physics calculations from the CPU and allowing it to perform other tasks instead, resulting in a smoother gaming experience and additional visual effects.
Middleware physics engines allow game developers to avoid writing their own code to handle the complex physics interactions possible in modern games.
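
To make the idea of "offloading physics calculations" concrete, here is a minimal sketch, in Java, of the kind of work a physics engine repeats every frame: integrating gravity and velocity for a set of rigid bodies. This is a toy illustration, not the PhysX API; the point is that the same small update runs over thousands of bodies per frame, which is exactly the sort of workload that parallelizes well on a PPU or GPU.

// Toy per-frame physics update: semi-implicit Euler integration under gravity.
// Illustrative only -- real engines such as PhysX add collision detection,
// constraint solving and sleeping, and run this across many bodies in parallel.
public class PhysicsStep {
    static class Body {
        double x, y, z;    // position (metres)
        double vx, vy, vz; // velocity (metres per second)
    }

    static final double GRAVITY = -9.81; // m/s^2 along the y axis

    static void step(Body[] bodies, double dt) {
        for (Body b : bodies) {
            b.vy += GRAVITY * dt; // update velocity first (semi-implicit Euler)
            b.x += b.vx * dt;     // then advance position with the new velocity
            b.y += b.vy * dt;
            b.z += b.vz * dt;
        }
    }

    public static void main(String[] args) {
        Body ball = new Body();
        ball.y = 10.0;                 // drop from 10 m
        Body[] world = { ball };
        for (int i = 0; i < 60; i++) { // one second at 60 frames per second
            step(world, 1.0 / 60.0);
        }
        System.out.printf("Height after 1 s of free fall: %.2f m%n", ball.y);
    }
}
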
The PhysX engine and SDK are available for the following platforms:
Apple Mac OS X
Windows
Linux (32-bit)
Nintendo Wii
Sony PlayStation 3
Microsoft Xbox 360
Nvidia provides both the engine and the SDK free of charge to Windows and Linux users and developers. The PlayStation 3 SDK is also freely available thanks to Sony's blanket purchase agreement.

A physics processing unit (PPU) is a dedicated microprocessor designed to handle physics calculations, especially in the physics engines of video games. Examples of workloads a PPU might handle include rigid body dynamics, soft body dynamics, collision detection, fluid dynamics, hair and clothing simulation, finite element analysis, and the fracturing of objects. The idea is that a specialized processor offloads time-consuming tasks from a computer's CPU, much as a GPU performs graphics operations in the main CPU's place.
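
Collision detection is the simplest of those workloads to sketch. The hypothetical helper below just reports whether two spheres overlap; a PPU's value comes from running enormous batches of tests like this (plus the much heavier contact resolution and solver work that follows) every frame so the CPU does not have to.

// Minimal broad-phase style test: do two spheres overlap?
// A PPU or GPU would evaluate huge batches of tests like this in parallel.
public class SphereOverlap {
    static boolean spheresOverlap(double x1, double y1, double z1, double r1,
                                  double x2, double y2, double z2, double r2) {
        double dx = x2 - x1, dy = y2 - y1, dz = z2 - z1;
        double distanceSquared = dx * dx + dy * dy + dz * dz;
        double radiusSum = r1 + r2;
        // Compare squared distances to avoid a square root per test.
        return distanceSquared <= radiusSum * radiusSum;
    }

    public static void main(String[] args) {
        System.out.println(spheresOverlap(0, 0, 0, 1.0, 1.5, 0, 0, 1.0)); // true, they intersect
        System.out.println(spheresOverlap(0, 0, 0, 1.0, 3.0, 0, 0, 1.0)); // false, too far apart
    }
}
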
The first PPUs were the SPARTA and HELLAS.
The term was coined by Ageia's marketing to describe their PhysX chip to consumers. Several other technologies in the CPU-GPU spectrum have some features in common with it, although Ageia's solution is the only complete one designed, marketed, supported, and placed within a system exclusively as a PPU.

Thursday, June 24, 2010

ANIMATRONICS

Animatronics is a cross between animation and electronics. Basically, an animatronic is a mechanized puppet that may be preprogrammed or remotely controlled. The term is an abbreviation of "Audio-Animatronics," originally coined by Walt Disney to describe his mechanized characters, but the idea can be traced back as far as Leonardo da Vinci's automaton lion, reputedly built to present lilies to the King of France during one of his visits. Animatronics has since developed into a career that can require combined talents in mechanical engineering, sculpting and casting, control technologies, electrical and electronic design, airbrushing, and radio control.
The subject of animatronics, emotional display, and recognition has evolved into a major industry and has become more efficient through new technologies. Animatronics is constantly changing due to rapid advancements in the hardware and software sides of the industry. The purpose of this research was to design and build an animatronic robot that enables students to investigate current trends in robotics. This paper highlights the debate and discussion around an engineering challenge aimed mainly at secondary-level students.
This paper explores the hardware and software design of animatronics and the emotional face displays of robots. The design experience included artistic design of the robot, selection of actuators, mechanical design, and programming of the animatronic robot. Students were challenged to develop models with the purpose of creating interest in learning Science, Technology, Engineering, and Mathematics.
It is also possible to build your own animatronics using ready-made kits from companies such as Mister Computers; no programming skills are required, only a working knowledge of Windows.
Animatronics was developed by Walt Disney in the early sixties. Essentially, an animatronic puppet is a figure that is animated by means of electromechanical devices. Early examples were found at the 1964 World's Fair in New York, in the Hall of Presidents, and at Disneyland. In the Hall of Presidents, Lincoln, with all the gestures of a statesman, gave the Gettysburg Address; body language and facial motions were matched to perfection with the recorded speech. Animatronics became a popular form of entertainment that proved itself in theme parks and the film industry.
Animatronics is a subset of anthropomorphic robots, which are designed by drawing inspiration from nature. A notable recent advance in building anthropomorphic robots is Kismet, developed at MIT, which engages people in expressive face-to-face interaction. Inspired by infant social development, psychology, ethology, and evolutionary perspectives, this work integrates theories and concepts from these diverse scientific viewpoints to enable Kismet to enter into natural and intuitive social interaction with a person, reminiscent of adult-infant exchanges. Kismet perceives a variety of natural social cues from visual and auditory channels, and delivers social signals to people through gaze direction, facial expression, body posture, and vocalization.
There has recently been a great deal of research around the world, particularly in Japan, on developing interactive robots with a human face. The development of interactive, human-like robots brings this research to the frontiers of artificial intelligence, materials, robotics, and psychology. Building machines that display emotions is a relatively new endeavor, although the idea itself goes back much further. The entertainment field also overlaps with new research on androids; the term android derives from fiction describing a complete mechanical automaton.
An extension of the engineering challenge is to explore the effectiveness of the project's capability to display human emotions, and to design the physical mechanisms that produce realistic human facial movements. The objective of this effort was to design and build an animatronic robot, SSU-1 (Savannah State University-1). SSU-1 will be controlled by a preprogrammed embedded microcontroller and will create human-like motions for entertainment purposes.
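
As a sketch of what "preprogrammed" control can mean in practice, the Java snippet below plays back a fixed table of keyframes for a single servo (say, a jaw), converting each angle into the pulse width a hobby servo expects. The angles, timing, and pulse-width range are illustrative assumptions, not details of the SSU-1 design, and the code prints the values instead of driving real hardware.

// Illustrative keyframe playback for one animatronic servo (e.g. a jaw).
// Angles are mapped to the 1000-2000 microsecond pulses typical of hobby servos;
// here we simply print the values rather than driving hardware.
public class JawSequence {
    static int angleToPulseMicros(double angleDegrees) {
        // 0 degrees -> 1000 us, 180 degrees -> 2000 us (typical mapping, assumed)
        return (int) Math.round(1000 + (angleDegrees / 180.0) * 1000);
    }

    public static void main(String[] args) throws InterruptedException {
        double[] jawAngles = { 10, 45, 25, 60, 10 }; // preprogrammed keyframes (degrees)
        long frameMillis = 200;                      // time between keyframes

        for (double angle : jawAngles) {
            System.out.printf("jaw angle %.0f deg -> pulse %d us%n",
                              angle, angleToPulseMicros(angle));
            Thread.sleep(frameMillis);
        }
    }
}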


Tuesday, June 22, 2010

Android OS

Android is Google's operating system for mobile devices. It is a competitor to Apple's iOS for the iPhone.
Technologically, Android includes middleware and key applications, and uses a modified version of the Linux kernel. It was initially developed by Android Inc., a firm later purchased by Google, and development has more recently been carried on by the Open Handset Alliance. Android allows developers to write managed code in the Java language, controlling the device via Google-developed Java libraries.
The Android operating system software stack consists of Java applications running on a Java-based, object-oriented application framework on top of Java core libraries running on the Dalvik virtual machine, which features JIT compilation. Libraries written in C include the surface manager, the OpenCore media framework, the SQLite relational database management system, the OpenGL ES 2.0 3D graphics API, the WebKit layout engine, the SGL graphics engine, SSL, and Bionic libc. The Android operating system consists of 12 million lines of code, including 3 million lines of XML, 2.8 million lines of C, and 2.1 million lines of Java.
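
To give a sense of what "managed code in the Java language" looks like at the application layer, here is the classic minimal component: a single Activity that the framework instantiates and calls into when the app is launched. This is a sketch assuming the standard Android SDK classes of the time; the manifest and build files are omitted.

// Minimal Android application component: one Activity that shows a line of text.
// The framework creates the Activity and calls onCreate() when it is launched.
import android.app.Activity;
import android.os.Bundle;
import android.widget.TextView;

public class HelloActivity extends Activity {
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        TextView label = new TextView(this);
        label.setText("Hello, Android");
        setContentView(label); // hand the view tree to the framework to render
    }
}

The source compiles to Java bytecode, which is then converted to Dalvik's .dex format and executed by the Dalvik virtual machine described above.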
The Android distribution was unveiled on 5 November 2007 alongside the founding of the Open Handset Alliance, a consortium of 71 hardware, software, and telecom companies devoted to advancing open standards for mobile devices. Google released most of the Android code under the Apache License, a free software and open source license.
According to NPD Group, unit sales for Android OS smartphones ranked second among all smartphone OS handsets sold in the U.S. in the first quarter of 2010. BlackBerry OS and iOS ranked first and third respectively.

Sunday, June 20, 2010

OPTICAL COMPUTING

With the growth of computing technology, the need for high-performance computers (HPC) has significantly increased. Optics has been used in computing for a number of years, but the main emphasis has been, and continues to be, on linking portions of computers for communications, or more intrinsically in devices that have some optical application or component (optical pattern recognition, etc.).

Optical computing was a hot research area in the 1980s, but the work tapered off due to materials limitations that prevented optochips from getting small enough and cheap enough to move beyond laboratory curiosities. Now optical computers are back, with advances in self-assembled conducting organic polymers that promise super-tiny all-optical chips.

Optical computing technology is, in general, developing in two directions. One approach is to build computers that have the same architecture as present-day computers but use optics in place of some of the electronics, that is, electro-optical hybrids. The other approach is to create a completely new kind of computer that can perform all functional operations optically. In recent years, a number of devices that can ultimately lead us to real optical computers have already been manufactured. These include optical logic gates, optical switches, optical interconnections, and optical memory.

Current trends in optical computing emphasize communications, for example the use of free-space optical interconnects as a potential solution to remove bottlenecks experienced in electronic architectures. Optical technology is one of the most promising directions, and may eventually lead to new computing applications as a consequence of faster processing speeds, as well as better connectivity and higher bandwidth.

2. NEED FOR OPTICAL COMPUTING

The pressing need for optical technology stems from the fact that today's computers are limited by the time response of electronic circuits. A solid transmission medium limits both the speed and volume of signals and also builds up heat that damages components.

One of the theoretical limits on how fast a computer can function is given by Einstein's principle that a signal cannot propagate faster than the speed of light. So, to make computers faster, their components must be made smaller, thereby decreasing the distances between them. This has resulted in the development of very large scale integration (VLSI) technology, with smaller device dimensions and greater complexity. The smallest VLSI feature sizes today are about 0.08 µm. Despite the incredible progress in the development and refinement of these basic technologies over the past decade, there is growing concern that they may not be capable of solving the computing problems of even the current millennium. Higher computer speeds have been achieved by miniaturizing electronic components to a very small, micron-size scale, but performance is limited not only by the speed of electrons in matter but also by the increasing density of interconnections necessary to link the electronic gates on microchips.
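
The scale of that limit is easy to work out. The snippet below computes how long light itself needs to cross a few representative distances; electrical signals in real wiring travel more slowly still, so at gigahertz clock rates even a few centimetres of interconnect cost a meaningful fraction of a clock cycle. The distances and the 3 GHz reference clock are chosen purely for illustration.

// How long does a signal take to cross a given distance at the speed of light?
// Electrical signals in wires are slower (very roughly half c), so these are best cases.
public class PropagationDelay {
    static final double SPEED_OF_LIGHT = 2.998e8; // metres per second

    public static void main(String[] args) {
        // 1 mm on-chip, 3 cm across a package, 30 cm across a board (illustrative)
        double[] distancesMetres = { 0.001, 0.03, 0.3 };
        for (double d : distancesMetres) {
            double nanoseconds = d / SPEED_OF_LIGHT * 1e9;
            System.out.printf("%.3f m -> %.3f ns (about %.2f cycles at 3 GHz)%n",
                              d, nanoseconds, nanoseconds * 3.0);
        }
    }
}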

The optical computer comes as a solution to this miniaturization problem. Optical data processing can perform several operations in parallel, much faster and more easily than electronics. This parallelism yields staggering computational power: for example, a calculation that would take a conventional electronic computer more than 11 years to complete could be performed by an optical computer in a single hour. In any case, in an optical computer electrons are replaced by photons, the quanta of electromagnetic radiation that make up light.

3. SOME KEY OPTICAL COMPONENTS FOR COMPUTING

The major breakthroughs in optical computing have centered on the development of micro-optic devices for data input.

VCSEL (VERTICAL CAVITY SURFACE EMITTING LASER)

A VCSEL (pronounced 'vixel') is a semiconductor vertical-cavity surface-emitting laser diode that emits light in a cylindrical beam vertically from the surface of a fabricated wafer, and it offers significant advantages over the edge-emitting lasers currently used in the majority of fiber-optic communications devices. The principle involved in the operation of a VCSEL is very similar to that of regular lasers.

There are two special semiconductor materials sandwiching an active layer where all the action takes place. But rather than reflective ends, in a VCSEL there are several layers of partially reflective mirrors above and below the active layer. Layers of semiconductors with differing compositions create these mirrors, and each mirror reflects a narrow range of wavelengths back into the cavity in order to cause light emission at just one wavelength.