This is TheTechStorm's roundup of some of the patents granted to Apple by the USPTO on October 30, 2012. The list includes "User interface for controlling three-dimensional animation of an object" and "Touch screen device, method, and graphical user interface for providing maps, directions, and location-based information," among others.
User interface for controlling three-dimensional animation of an object
In the last few decades, computers and software have been used to animate objects. Initially, animation software was complicated and difficult to use. A user was generally required to interact with objects using a low level of abstraction. For example, a user would manually create different visual representations of an object (keyframes) and then use software to interpolate between them.
Recently, animation software has become more user-friendly, enabling a user to interact with objects at a higher level of abstraction. For example, a user may animate an object by applying a “behavior” to the object. A behavior is an animation abstraction and can be thought of as a macro, script, or plugin. When a behavior is applied to an object, the object is animated in a particular way (e.g., by growing or shrinking or by moving in a specific direction). Some examples of animation software that support behaviors are Anark Studio and Macromedia Director MX.
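The behavior abstraction described above can be sketched in code. This is a minimal, illustrative model only; the class and parameter names are assumptions, not taken from the patent or from any actual product.

```python
# Illustrative sketch of the "behavior" abstraction: a behavior is a
# reusable, parameterized animation rule applied to an object and
# evaluated once per frame. All names here are hypothetical.

class Object2D:
    def __init__(self, x=0.0, y=0.0, scale=1.0):
        self.x, self.y, self.scale = x, y, scale

class ThrowBehavior:
    """Moves an object in a fixed direction at a constant speed."""
    def __init__(self, dx, dy):
        self.dx, self.dy = dx, dy   # parameters a user could customize

    def apply(self, obj, dt):
        obj.x += self.dx * dt
        obj.y += self.dy * dt

class GrowBehavior:
    """Scales an object up over time."""
    def __init__(self, rate):
        self.rate = rate

    def apply(self, obj, dt):
        obj.scale *= (1.0 + self.rate * dt)

# Animating means applying every attached behavior each frame,
# instead of hand-authoring keyframes.
obj = Object2D()
behaviors = [ThrowBehavior(dx=10.0, dy=0.0), GrowBehavior(rate=0.5)]
for frame in range(60):             # 60 frames of ~1/60 s each
    for b in behaviors:
        b.apply(obj, dt=1.0 / 60.0)
```

The key point the patent builds on is that the user sets a few high-level parameters (direction, rate) and the software evaluates the animation, rather than the user interpolating keyframes by hand.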
Although behaviors make it easier to animate objects, software that supports behaviors can still be difficult to use. Many types of behaviors may be applied to one object, and each type of behavior can be customized based on several parameters. Understanding each of these parameters and its effect on the behavior can be confusing. Providing values for all of these parameters can also be time-consuming.
What is needed is a better user interface for animating objects using behaviors.
Various embodiments of the invention cover different aspects of behaviors and of working with them. One embodiment covers behaviors themselves, including the animations produced by applying a behavior to an item and the algorithms underlying those animations. Another covers using behaviors in conjunction with keyframes. Others cover working with behaviors (setting parameters, saving behaviors, and creating new ones); the objects to which behaviors may be applied, including images, text, particle systems, filters, generators, and other behaviors; dynamic rendering of objects with applied behaviors, including changing an animation in real time after a behavior parameter's value has changed; and hardware acceleration methods that enable users to work effectively with behaviors.
A user can control the animation of an object via an interface that includes a control area and a user-manipulable control element. The control area includes an ellipse. The user-manipulable control element includes a three-dimensional arrow with a straight body, a three-dimensional arrow with a curved body, or a sphere. In one embodiment, the interface includes a virtual trackball that is used to manipulate the user-manipulable control element.
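The "virtual trackball" mentioned above is a well-known technique for turning a 2D drag into a 3D rotation. The sketch below uses the standard arcball construction (project the pointer onto a virtual sphere, then derive a rotation axis and angle from two projected points); it is an assumption about the general idea, not Apple's specific patented method, and all names are illustrative.

```python
# Standard arcball-style virtual trackball: a 2D drag point is projected
# onto a virtual sphere, and two such points define a rotation axis and
# angle. This is a generic construction, not the patent's exact design.

import math

def map_to_sphere(x, y, radius=1.0):
    """Project a 2D point (in [-1, 1] coordinates) onto a virtual sphere."""
    d2 = x * x + y * y
    r2 = radius * radius
    if d2 <= r2 / 2.0:
        z = math.sqrt(r2 - d2)           # point lies on the sphere
    else:
        z = r2 / (2.0 * math.sqrt(d2))   # hyperbolic sheet outside it
    return (x, y, z)

def trackball_rotation(p_from, p_to):
    """Unit rotation axis and angle (radians) taking p_from to p_to."""
    a = map_to_sphere(*p_from)
    b = map_to_sphere(*p_to)
    axis = (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])   # cross product a x b
    n = math.sqrt(sum(c * c for c in axis)) or 1.0
    la = math.sqrt(sum(c * c for c in a))
    lb = math.sqrt(sum(c * c for c in b))
    dot = sum(x * y for x, y in zip(a, b))
    angle = math.acos(max(-1.0, min(1.0, dot / (la * lb))))
    return tuple(c / n for c in axis), angle

# Dragging right from the center rotates about the vertical (y) axis:
axis, angle = trackball_rotation((0.0, 0.0), (0.5, 0.0))
```

In a full interface, the resulting axis and angle would be applied to the manipulable control element (the 3D arrow or sphere) each time the pointer moves.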
Touch screen device, method, and graphical user interface for providing maps, directions, and location-based information
A device, method, and graphical user interface for providing maps, directions, and location-based information on a touch screen display are disclosed.
Capacitive sensor coupling correction
Engineering change order language for modifying integrated circuit design files for programmable logic device implementation
Widget authoring and editing environment
An authoring environment for creating and/or editing user interface elements such as widgets used in a unified interest layer. The authoring environment facilitates creation of widgets that have a consistent appearance and mechanism, and allows third-party developers to easily create widgets that have a look and feel that is consistent with a predefined set of widgets.
Techniques for presenting sound effects on a portable media player
Improved techniques for presenting sound effects at a portable media device are disclosed. The sound effects can be output as audio sounds to an internal speaker, an external speaker, or both. In addition, the audio sounds for the sound effects can be output together with other audio sounds pertaining to media assets (e.g., audio tracks being played). In one embodiment, the sound effects can serve to provide auditory feedback to a user of the portable media device. A user interface can facilitate a user’s selection of sound effect usages, types or characteristics.
Seamless display migration
Exemplary embodiments of methods, apparatuses, and systems for seamlessly migrating a user visible display stream sent to a display device from one rendered display stream to another rendered display stream are described. For one embodiment, mirror video display streams are received from both a first graphics processing unit (GPU) and a second GPU, and the video display stream sent to a display device is switched from the video display stream from the first GPU to the video display stream from the second GPU, wherein the switching occurs during a blanking interval for the first GPU that overlaps with a blanking interval for the second GPU.
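The switching condition described above can be sketched as a timing check: the display stream flips from one GPU to the other only at an instant when both GPUs are inside a vertical blanking interval, so the viewer sees no glitch. The timing model below is entirely illustrative, with assumed refresh rates and blanking lengths.

```python
# Hedged sketch of the overlapping-blanking-interval condition: switch
# the display source from GPU A to GPU B only at a moment when both are
# in vertical blanking. The timing model and numbers are assumptions.

def in_blanking(t, period, blank_len, phase):
    """True if time t falls inside this GPU's vertical blanking interval."""
    return ((t - phase) % period) < blank_len

def find_switch_time(period_a, period_b, blank_len, phase_a, phase_b,
                     step=0.0001, horizon=1.0):
    """Earliest time (within horizon) when both GPUs blank simultaneously."""
    t = 0.0
    while t < horizon:
        if (in_blanking(t, period_a, blank_len, phase_a) and
                in_blanking(t, period_b, blank_len, phase_b)):
            return t   # safe moment to switch the mux to the other GPU
        t += step
    return None        # no overlap found within the horizon

# Two 60 Hz streams slightly out of phase; blanking lasts ~0.5 ms:
t = find_switch_time(1 / 60, 1 / 60, 0.0005, phase_a=0.0, phase_b=0.0002)
```

Because both GPUs render mirrored streams, switching during a shared blanking interval makes the handoff invisible to the user.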
Hybrid inertial and touch sensing input device
Conventional input devices, such as computer mice, typically employ optical sensors, track wheels, or track balls to control the motion of a cursor or other navigational object on a computer display screen. Other types of input devices measure a force imparted onto the device, typically incorporating one or more accelerometers to sense acceleration forces exerted on the device as a user moves it. A velocity of the input device may be estimated by integrating the measured acceleration over time, and a position estimate may be obtained by integrating that velocity over time. In this way, motion of an accelerometer-based input device may be translated into motion of a cursor or other navigational object on a computer display screen.
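The double integration just described can be sketched directly. This is a minimal dead-reckoning example using simple Euler integration; the sample data and function name are illustrative.

```python
# Sketch of the dead-reckoning math above: integrate measured
# acceleration once to estimate velocity, and again to estimate
# position. Euler integration; the sample stream is made up.

def integrate_motion(accel_samples, dt):
    """Return (velocity, position) estimates from acceleration samples."""
    velocity = 0.0
    position = 0.0
    for a in accel_samples:
        velocity += a * dt           # v(t) = integral of a(t) dt
        position += velocity * dt    # x(t) = integral of v(t) dt
    return velocity, position

# Constant 2 m/s^2 acceleration for 1 s, sampled at 100 Hz:
v, x = integrate_motion([2.0] * 100, dt=0.01)
```

Note that small accelerometer errors compound under double integration, which is one practical reason a purely inertial device tracks fine motion poorly, as the next paragraphs discuss.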
Touch-sensitive panels can also be used as an input device to control the motion of a cursor or other navigational object on a computer display screen. One common type of touch-sensitive panel is a touch pad. In general, touch-sensitive panels can detect one or more touch contacts on the surface of the touch-sensitive panel and generate signals indicative of the touch contacts. A computer can then control a cursor or other navigational object based on the detected touch contacts.
Various problems are associated with conventional input devices. Most, if not all, are inadequate at tracking both large and fine motions. Inertial sensing-based input devices typically track large ranges of motion well (e.g., moving a cursor across the length of a display screen), but not fine ranges of motion. In contrast, touch-sensitive pads typically track fine ranges of motion well, but not large ones: moving a cursor from one end of the display screen to the other may require a user to swipe his or her finger across a touch pad multiple times.
Embodiments of the present invention are directed to input devices using both inertial sensors and touch sensors. An exemplary input device has a motion sensing element capable of estimating a position of the input device based on a force applied to the input device. The motion sensing element can be used to track large ranges of motion. The input device can also include a touch sensitive surface operable to detect touches on the touch sensitive surface. The touch sensitive surface can be used to track relatively smaller ranges of motion.
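The division of labor described above (inertial sensing for coarse motion, touch sensing for fine motion) can be sketched as a simple selection rule. The threshold and blending logic below are assumptions for illustration, not the patented method.

```python
# Illustrative sketch of the hybrid idea: route large motions to the
# inertial estimate and small adjustments to the touch-sensor delta.
# The threshold value and selection rule are assumptions.

def hybrid_cursor_delta(inertial_delta, touch_delta, coarse_threshold=5.0):
    """Pick the sensor suited to the motion's scale (pixels per frame)."""
    if abs(inertial_delta) >= coarse_threshold:
        return inertial_delta   # large sweep: trust the inertial sensor
    return touch_delta          # fine adjustment: trust the touch surface

# A large motion comes from the inertial path, a fine one from touch:
large = hybrid_cursor_delta(inertial_delta=40.0, touch_delta=3.0)
fine = hybrid_cursor_delta(inertial_delta=0.4, touch_delta=1.2)
```

A real device would likely blend the two estimates continuously rather than switching at a hard threshold, but the sketch captures why combining the sensors addresses both ends of the motion range.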
Display screen portion with animated graphical user interface