
Tuesday, October 27, 2015

Introducing a New Course on Developing Android Apps for Auto

Posted by Wayne Piekarski, Developer Advocate



Android Auto brings the Android platform to the car in a way that’s optimized for the driving experience, allowing the user to keep their hands on the wheel, and their eyes on the road. To learn how to extend your existing media and messaging apps to work within a car, we collaborated with Udacity to introduce a new course on Ubiquitous Computing with Android Auto.






Designed by Developer Advocates from Google, the course shows you how to take advantage of your existing Android knowledge to work on this new platform. The best part is that Android Auto is based on extensions to the regular Android framework, so you don't need to rewrite your existing apps to support it. You'll learn how to implement messaging apps by using notification extensions. You'll also learn how audio players just work on Android Auto when you use the Android media APIs. In both cases, we work through some simple Android samples and then show what changes are needed to extend them for Android Auto. Finally, we show a complete music-playing sample and how it works across other platforms like Android Wear.



If you have an interest in Android-based messaging or media apps, then you need to learn about Android Auto. Users want to take their experience to other places, such as their cars, not just their phones. Having Auto support will allow you to differentiate your app and give users another reason to try it.



This class is part of our larger series on Ubiquitous Computing across Google platforms, such as Android Wear, Android Auto, Android TV, and Google Cast. The courses are designed to be short and standalone, so you can take any one on its own, or take them all! The Android Auto platform is a great opportunity to add functionality that will distinguish your app from others. This Udacity course will get you up to speed quickly with everything you need to get started.



Get started now and try it out at no cost; your users are waiting!



Thursday, August 27, 2015

Announcing the Android Auto Desktop Head Unit

Posted by Josh Gordon, Developer Advocate


Today we’re releasing the Desktop Head Unit (DHU), a new testing tool for Android Auto developers. The DHU enables your workstation to act as an Android Auto head unit that emulates the in-car experience for testing purposes. Once you’ve installed the DHU, you can test your Android Auto apps by connecting your phone and workstation via USB. Your phone will behave as if it’s connected to a car, and your app is displayed on the workstation just as it would be displayed in a car.











The DHU runs on your workstation. Your phone runs the Android Auto companion app.



Now you can test pre-released versions of your app in a production-like environment, without having to work from your car. With the release of the DHU, the previous simulators are deprecated, but will be supported for a short period prior to being officially removed.


Getting started


You’ll need an Android phone running Lollipop or higher, with the Android Auto companion app installed. Compile your Auto app and install it on your phone.


Install the DHU


Install the DHU on your workstation by opening the SDK Manager and downloading it from Extras > Android Auto Desktop Head Unit emulator. The DHU will be installed in the <sdk>/extras/google/auto/ directory.


Running the DHU



Be sure your phone and workstation are connected via USB.


  1. Enable Android Auto developer mode by starting the Android Auto companion app and tapping on the header image 10 times. This is a one-time step.



  2. Start the head unit server in the companion app by opening the menu and selecting “Start head unit server”. This option only appears after developer mode is enabled.











  3. Confirm the head unit server is running before starting the DHU on your workstation; a notification is shown on the phone while the server is active.


  4. On your workstation, set up port forwarding using ADB to allow the DHU to connect to the head unit server running on your phone. Open a terminal and type adb forward tcp:5277 tcp:5277. Don’t forget this step!



  5. Start the DHU:

      cd <sdk>/extras/google/auto/

      On Linux or OS X: ./desktop-head-unit

      On Windows: desktop-head-unit.exe



At this point the DHU will launch on your workstation, and your phone will enter Android Auto mode. Check out the developer guide for more info. We hope you enjoy using the DHU!


Join the discussion on +Android Developers

Wednesday, June 24, 2015

Android Developer Story: Shifty Jelly drives double-digit growth with material design and expansion to the car and wearables

Posted by Lily Sheringham, Google Play team



Pocket Casts is a leading podcasting app on Google Play built by Australian-based mobile development company Shifty Jelly. The company recently achieved $1 million in sales for the first time, reaching more than 500K users.



According to co-founder Russell Ivanovic, the adoption of material design played a significant role in driving user engagement for Pocket Casts by streamlining the user experience. Moreover, users are now able to access the app beyond the smartphone: in the car with Android Auto, on a watch with Android Wear, or on the TV with Google Cast. The rapid innovation of Android features helped Pocket Casts increase sales by 30 percent.



We chatted with co-founders and Android developers Russell and Philip Simpson to learn more about how they are growing their business with Android.





Here are some of the features Pocket Casts used:


  • Material Design: Learn more about material design and how it helps you create beautiful, engaging apps.

  • Android Wear: Extend your app to Android Wear devices with enhanced notifications or a standalone wearable app.

  • Android Auto: Extend your app to an interface that’s optimized for driving with Android Auto.

  • Google Cast: Let your users cast your app’s content to Google Cast devices like Chromecast, Android TV, and speakers with Google Cast built-in.


And check out the Pocket Casts app on Google Play!



Friday, April 3, 2015

Enable your messaging app for Android Auto

Posted by Joshua Gordon, Developer Advocate



What if there was a way for drivers to stay connected using your messaging app, while keeping their hands on the wheel and eyes on the road?



Android Auto helps drivers stay connected, but in a more convenient way that's integrated with the car. It eliminates the need to type and read messages by replacing these activities with a voice-controlled interface.



Enabling your messaging app to work with Android Auto is easy. Developers like Skype and textPlus have already done so. Check out this DevByte for an overview of the messaging APIs, and see the developer training guide for a deep dive. Read on for a look at the key steps involved.






Message notifications on the car’s display



When an Android 5.0+ phone is connected to a compatible car, users receive incoming message notifications from Auto-enabled apps on the car’s head unit display. Your app runs on the phone, but is controlled by the car. To learn more about how this works, watch the Introduction to Android Auto DevByte.





A new message notification from Skype



If your app already uses notifications to alert the user to incoming messages, it’ll be easy to extend these for Auto. It takes just a few lines of code, and you won’t have to change how your app works on the phone.



There are a couple of small differences between message notifications on Auto and on a phone. First, a preview of the message content isn’t shown on Auto, because messaging is driven entirely by voice. Second, message notifications are backed by a conversation object, which is simply a collection of unread messages from a particular sender.



Decorate your notification with the CarExtender to add support for the car. Next, use the UnreadConversation.Builder to create a conversation, and populate it by iterating over your app's unread messages (from a certain sender) and adding them to the conversation. Pass your conversation object to the CarExtender, and you’re done!
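As a rough sketch of how those pieces fit together (using the support-v4 NotificationCompat classes), the code below builds an unread conversation and attaches it to a notification. The names senderName, conversationId, unreadMessages, and MessageReadReceiver are hypothetical placeholders for your app's own data and classes:

    // Hypothetical: a PendingIntent the system fires when the user has heard the messages.
    PendingIntent readPendingIntent = PendingIntent.getBroadcast(
            context, conversationId,
            new Intent(context, MessageReadReceiver.class)
                    .putExtra("conversation_id", conversationId),
            PendingIntent.FLAG_UPDATE_CURRENT);

    // Collect the unread messages from this sender into a single conversation object.
    NotificationCompat.CarExtender.UnreadConversation.Builder conversationBuilder =
            new NotificationCompat.CarExtender.UnreadConversation.Builder(senderName)
                    .setLatestTimestamp(System.currentTimeMillis())
                    .setReadPendingIntent(readPendingIntent);
    for (String message : unreadMessages) {
        conversationBuilder.addMessage(message);
    }

    // Attach the conversation to a normal notification via the CarExtender.
    Notification notification = new NotificationCompat.Builder(context)
            .setSmallIcon(R.drawable.ic_notification)
            .setContentTitle(senderName)
            .setContentText(unreadMessages.get(unreadMessages.size() - 1))
            .extend(new NotificationCompat.CarExtender()
                    .setUnreadConversation(conversationBuilder.build()))
            .build();
    NotificationManagerCompat.from(context).notify(conversationId, notification);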



Tap to hear messages



Tapping on a message notification plays it back on the car's sound system, via text to speech. This is handled automatically by the framework; no additional code is required. Pretty cool, right?



In order to know when the user hears a message, you provide a PendingIntent that’s triggered by the system. That’s one of just two intents you’ll need to handle to enable your app for Auto.



Reply by voice



Voice control is the real magic of Android Auto. Users reply to messages by speaking, via voice recognition. This is far faster and more natural than typing.



Enabling this functionality is as simple as adding a RemoteInput instance to your conversation objects, before you issue the notification. Speech recognition is handled entirely by the framework. The recognition result is delivered to your app as a plain text string via a second PendingIntent.
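Continuing the hypothetical example above, a minimal sketch of the voice-reply side might look like this; the result key "extra_voice_reply" and the MessageReplyReceiver class are placeholder names:

    // Hypothetical: describe the voice input the system should collect for a reply.
    RemoteInput remoteInput = new RemoteInput.Builder("extra_voice_reply")
            .setLabel("Reply by voice")
            .build();

    // Hypothetical: the PendingIntent delivered to your app with the recognized text.
    PendingIntent replyPendingIntent = PendingIntent.getBroadcast(
            context, conversationId,
            new Intent(context, MessageReplyReceiver.class)
                    .putExtra("conversation_id", conversationId),
            PendingIntent.FLAG_UPDATE_CURRENT);

    // Attach the reply action to the conversation before issuing the notification.
    conversationBuilder.setReplyAction(replyPendingIntent, remoteInput);

    // Later, inside MessageReplyReceiver.onReceive(), read the transcribed reply:
    Bundle results = RemoteInput.getResultsFromIntent(intent);
    if (results != null) {
        CharSequence reply = results.getCharSequence("extra_voice_reply");
        // Hand the reply text to your messaging backend.
    }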




Replying to a message from textPlus by voice.



Next Steps



Make your messaging app more natural to use in the car by enabling it for Android Auto. Now drivers can stay connected without typing or reading messages. It just takes a few lines of code. To learn more, visit developer.android.com/auto.

Wednesday, March 25, 2015

Developing audio apps for Android Auto

Posted by Joshua Gordon, Developer Advocate



Have you ever wanted to develop apps for the car, but found the variety of OEMs and proprietary platforms too big of a hurdle? Now with Android Auto, you can target a single platform supported by vehicles coming soon from 28 manufacturers.



Using familiar Android APIs, you can easily add a great in-car user experience to your existing audio apps, with just a small amount of code. If you’re new to developing for Auto, watch this DevByte for an overview of the APIs, and check out the training docs for an end-to-end tutorial.






Playback and custom controls
















Custom playback controls on NPR One and iHeartRadio.



The first thing to understand about developing audio apps on Auto is that you don’t draw your user interface directly. Instead, the framework provides two well-defined UIs (one for playback, one for browsing) that are created automatically. This ensures consistent behavior across audio apps for drivers, and frees you from dealing with any car-specific functionality or layouts. Although the layout is predefined, you can customize it with artwork, color themes, and custom controls.



Both NPR One and iHeartRadio customize their UI. NPR One adds controls to mark a story as interesting, to view a list of upcoming stories, and to skip to the next story. iHeartRadio adds controls to favorite stations and to like songs. Both apps store user preferences across form factors.



Because the UI is drawn by the framework, playback commands need to be relayed to your app. This is accomplished with the MediaSession callback, which has methods like onPlay() and onPause(). All car-specific functionality is handled behind the scenes. For example, you don’t need to know whether a command came from the touch screen, the steering wheel buttons, or the user’s voice.
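As a rough sketch, a callback wired to a MediaSession might look like the following; the player object is a stand-in for whatever playback engine your app already uses:

    MediaSession mediaSession = new MediaSession(context, "AudioAppSession");
    mediaSession.setFlags(MediaSession.FLAG_HANDLES_MEDIA_BUTTONS
            | MediaSession.FLAG_HANDLES_TRANSPORT_CONTROLS);
    mediaSession.setCallback(new MediaSession.Callback() {
        @Override
        public void onPlay() {
            // Invoked for play commands, whether they came from the touch screen,
            // the steering wheel buttons, or a voice command.
            player.start();
        }

        @Override
        public void onPause() {
            player.pause();
        }

        @Override
        public void onSkipToNext() {
            player.next();
        }
    });
    mediaSession.setActive(true);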



Browsing and recommendations

















Browsing content on NPR One and iHeartRadio.



The browsing UI is likewise drawn by the framework. You implement the MediaBrowserService to share your content hierarchy with the framework. A content hierarchy is a collection of MediaItems that are either playable (e.g., a song, audio book, or radio station) or browsable (e.g., a favorites folder). Together, these form a tree used to display a browsable menu of your content.
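A bare-bones sketch of such a service is shown below; the class name, MEDIA_ROOT_ID, and the single "Favorites" folder are hypothetical, and a real app would populate the tree from its own catalog:

    import android.media.MediaDescription;
    import android.media.browse.MediaBrowser;
    import android.os.Bundle;
    import android.service.media.MediaBrowserService;
    import java.util.ArrayList;
    import java.util.List;

    public class MyMusicService extends MediaBrowserService {
        private static final String MEDIA_ROOT_ID = "root";

        @Override
        public BrowserRoot onGetRoot(String clientPackageName, int clientUid, Bundle rootHints) {
            // Returning a non-null root lets clients such as Android Auto browse the content.
            return new BrowserRoot(MEDIA_ROOT_ID, null);
        }

        @Override
        public void onLoadChildren(String parentId, Result<List<MediaBrowser.MediaItem>> result) {
            List<MediaBrowser.MediaItem> items = new ArrayList<>();
            if (MEDIA_ROOT_ID.equals(parentId)) {
                // Top level of the tree: a browsable folder; playable items are added
                // the same way with FLAG_PLAYABLE.
                items.add(new MediaBrowser.MediaItem(
                        new MediaDescription.Builder()
                                .setMediaId("favorites")
                                .setTitle("Favorites")
                                .build(),
                        MediaBrowser.MediaItem.FLAG_BROWSABLE));
            }
            result.sendResult(items);
        }
    }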



With both apps, recommendations are key. NPR One recommends a short list of in-depth stories that can be selected from the browsing menu. These improve over time based on user feedback. iHeartRadio’s browsing menu lets you pick from favorites and recommended stations, and their “For You” feature gives recommendations based on user location. The app also provides the ability to create custom stations from the browsing menu. Doing so is efficient and requires only three taps (“Create Station” -> “Rock” -> “Foo Fighters”).



When developing for the car, it’s important to quickly connect users with content to minimize distractions while driving. It’s important to note that design considerations on Android Auto are different than on a mobile device. If you imagine a typical media player on a phone, you may picture browsable menus of “all tracks” or “all artists”. These are not ideal in the car, where the primary focus should be on the road. Both NPR One and iHeartRadio provide good examples of this, because they avoid deep menu hierarchies and lengthy browsable lists.



Voice actions for hands free operation



Voice actions (e.g., “Play KQED”) are an important part of Android Auto. You can support voice actions in your app by implementing onPlayFromSearch() in the MediaSession.Callback. Voice actions may also be used to start your app from the home screen (e.g., “Play KQED on iHeartRadio”). To enable this functionality, declare the MEDIA_PLAY_FROM_SEARCH intent filter in your manifest. For an example, see this sample app.
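A minimal sketch of handling a voice query inside your MediaSession.Callback might look like this, covering both a specific query and an empty one (e.g., “Play music on <your app>”); playFromQuery() and playDefaultStation() are hypothetical helpers:

    @Override
    public void onPlayFromSearch(String query, Bundle extras) {
        if (TextUtils.isEmpty(query)) {
            // The user asked to play something without naming it; pick a sensible default.
            playDefaultStation();
        } else {
            // Resolve the spoken query ("KQED", an artist, a station, ...) against your catalog.
            playFromQuery(query);
        }
    }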



Next steps



NPR One and iHeartRadio are just two examples of great apps for Android Auto today. They feel like a part of the car, and look and sound great. You can extend your apps to the car today, too, and developing for Auto is easy. The framework handles the car-specific functionality for you, so you’re free to focus on making your app special. Join the discussion at http://g.co/androidautodev if you have questions or ideas to share. To get started on your app, visit developer.android.com/auto.



Thursday, March 19, 2015

Take your apps on the road with Android Auto

Posted by Wayne Piekarski, Developer Advocate



Starting today, anyone can take their apps for a drive with Android Auto using Android 5.0+ devices, connected to compatible cars and aftermarket head units. Android Auto lets you easily extend your apps to the car in an efficient way for drivers, allowing them to stay connected while still keeping their hands on the wheel and their eyes on the road. When users connect their phone to a compatible vehicle, they will see an Android experience optimized for the head unit display that seamlessly integrates voice input, touch screen controls, and steering wheel buttons. Moreover, Android Auto provides consistent UX guidelines to ensure that developers are able to create great experiences across many diverse manufacturers and vehicle models, with a single application available on Google Play.



With the availability of the Pioneer AVIC-8100NEX, AVIC-7100NEX, and AVH-4100NEX aftermarket systems in the US, the AVIC-F77DAB, AVIC-F70DAB, and AVH-X8700BT in the UK, and the AVIC-F70DAB and AVH-X8750BT in Australia, it is now possible to add Android Auto to many cars already on the road. As a developer, you now have a way to test your apps in a realistic environment. These are just the first Android Auto devices to launch, and vehicles from major auto manufacturers with integrated Android Auto support are coming soon.



With the increasing adoption of Android Auto by manufacturers, your users are going to be expecting more support of their apps in the car, so now is a good time to get started with development. If you are new to Android Auto, check out our DevByte video, which explains more about how this works, along with some live demos.





The SDK for Android Auto was made available to developers a few months ago, and now Google Play is ready to accept your application updates. Your existing apps can take advantage of all these cool new Android Auto features with just a few small changes. You’ll need to add Android Auto support to your application, and then agree to the Android Auto terms in the Pricing & Distribution category in the Google Play Developer Console. Once the application is approved, it will be made available as an update to your users and shown on the car’s display.



Adding support for Android Auto is easy. We have created an extensive set of documentation to help you add support for messaging (sample) and audio playback (sample). There are also short introductory DevByte videos for messaging and audio. Stay tuned for a series of posts coming up soon discussing more details of these APIs and how to work with them. We also have simulators to help you test your applications right at your desk during development.



With the launch of Android Auto, a new set of possibilities is available for you to make even more amazing experiences for your users, providing them the right information for the road ahead. Come join the discussion about Android Auto on Google+ at http://g.co/androidautodev, where you can share ideas and ask questions with other developers.



Thursday, March 12, 2015

A New Reference App for Multi-device Applications

It is now possible to bring the benefits of your app to your users wherever they happen to be, no matter what device they have near them. Today we’re releasing a reference sample that shows how to implement such a service with an app that works across multiple Android form-factors. This sample, the Universal Music Player, is a bare-bones but functional reference app that supports multiple devices and form factors in a single codebase. It is compatible with Android Auto, Android Wear, and Google Cast devices. Give it a try and easily adapt your own app for wherever your users are, be that a phone, watch, TV, car, or more!




Playback controls and album art in the lock screen.

On the application toolbar, the Google Cast icon.





Controlling playback through Android Auto





Controlling playback on an Android Wear watch


This sample uses a number of new features in Android 5.0 Lollipop, like MediaStyle notifications, MediaSession, and MediaBrowserService. They make it easy to implement media browsing and playback on multiple devices with a single version of your app.
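For instance, a MediaStyle notification that surfaces playback controls might be sketched roughly as follows; the mediaSession, pauseIntent, nextIntent, and track objects are assumed to already exist elsewhere in the app:

    Notification notification = new Notification.Builder(context)
            .setSmallIcon(R.drawable.ic_music_note)
            .setContentTitle(track.getTitle())
            .setContentText(track.getArtist())
            .addAction(new Notification.Action(R.drawable.ic_pause, "Pause", pauseIntent))
            .addAction(new Notification.Action(R.drawable.ic_skip_next, "Next", nextIntent))
            .setStyle(new Notification.MediaStyle()
                    // Tie the notification to the session so the controls stay in sync.
                    .setMediaSession(mediaSession.getSessionToken())
                    .setShowActionsInCompactView(0, 1))
            .build();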


Check out the source code and let your users enjoy your app from wherever they like.


Posted by Renato Mangini, Senior Developer Platform Engineer, Google Developer Platform Team