Announcing the winners of the #AndroidDevChallenge, powered by on-device machine learning




Posted by Jacob Lehrbaum, Director of Developer Relations, Android




Developers like you have always played an important role in Android innovation. Over 10 years ago, when we first launched the Android SDK, we also announced the Android Developer Challenge to reward model apps and highlight new ways of solving user problems. As Android pushes the boundaries of machine learning, 5G, foldables, and more, developers continue to help shape these new frontiers. To celebrate this work, we revived the challenge in 2019, with a focus on “Helpful Innovation,” powered by on-device machine learning.




We received hundreds of creative projects and, at the end of last year, picked 10 winners who each combined a strong idea with a thirst to bring it to life. Since then, we’ve been working with those winners to help turn their ideas into reality. And today, we’re announcing those 10 winners. Some are still at the beginning of their journey, but their apps are now ready for you to download and try out!




  • AgroDoc helps farmers diagnose plant disease and make treatment plans. [Navneet Krishna; Kochi, India]

  • AgriFarm helps farmers detect plant diseases and prevent major damage to fruits and vegetables such as tomatoes, corn, and potatoes. [Balochistan, Pakistan]

  • Eskke streamlines mobile money management for people in the Congo, letting them transfer money, pay bills, buy subscriptions and essential airtime through SMS. [David Mumbere Kathoh; Goma, Democratic Republic of Congo]

  • Leepi helps students learn hand gestures and symbols for American Sign Language. [Prince Patel; Bengaluru, India]

  • MixPose is a live streaming platform that gives yoga teachers and fitness professionals the opportunity to teach, track alignment, and give feedback in real-time. [Peter Ma; San Francisco, California, USA]

  • Pathfinder could help people with visual impairments navigate complex situations by identifying and calculating the trajectories of objects moving in their path. [Colin Shelton; Addison, Texas, USA]

  • Snore & Cough helps you identify and analyze snoring and coughing, to help provide info to users seeking assistance from a medical professional. [Ethan Fan; Mountain View, California, USA]

  • Stila pairs with a wearable device, like the Fitbit wristband or a device running Wear OS by Google, to monitor and track the body’s stress levels. By monitoring stress levels over time, you have the chance to better understand and manage stress in your life. [Yingding Wang; Munich, Germany]

  • Trashly makes recycling easier. Just point the on-device camera at an item, and through object detection, the app identifies and classifies plastic and paper cups, bags, bottles, etc. [Elvin Rakhmankulov; Chicago, Illinois, USA]

  • UnoDogs helps owners better support their pet’s wellness, providing customized information and fitness programs. [Chinmay Mishra; New Delhi, India]


Making on-device machine learning more accessible, with ML Kit and TensorFlow Lite





Increasingly, machine learning is becoming an accessible tool for developers with little to no background in the technology. In fact, for most of the winners of the Android Developer Challenge, this was their first foray into machine learning. That’s thanks in part to two key offerings from Google, which bring on-device machine learning into reach for millions of developers around the world.



The first is ML Kit. ML Kit brings Google’s on-device machine learning technologies to mobile app developers, so they can build customized and interactive experiences into their apps. This includes tools such as language translation, text recognition, object detection and more. Eskke, for instance, uses offline text recognition and barcode scanning from ML Kit so users can scan the QR code at a mobile money kiosk and quickly withdraw money. And MixPose uses ML Kit's forthcoming Pose detection API to detect each user’s yoga positions and movements, so teachers can provide feedback.
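

To give a sense of how approachable this is, here is a minimal Kotlin sketch of the kind of on-device QR code scan an app like Eskke performs with ML Kit’s Barcode Scanning API. The helper name and the Bitmap input are illustrative, and package paths may vary slightly between ML Kit versions:

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.barcode.Barcode
import com.google.mlkit.vision.barcode.BarcodeScannerOptions
import com.google.mlkit.vision.barcode.BarcodeScanning
import com.google.mlkit.vision.common.InputImage

// Illustrative helper: scan a kiosk QR code from a captured frame, entirely on-device.
fun scanKioskQrCode(frame: Bitmap, onResult: (String?) -> Unit) {
    // Restrict the scanner to QR codes so detection stays fast.
    val options = BarcodeScannerOptions.Builder()
        .setBarcodeFormats(Barcode.FORMAT_QR_CODE)
        .build()
    val scanner = BarcodeScanning.getClient(options)

    // Wrap the frame (rotation 0 assumed) so ML Kit can process it.
    val image = InputImage.fromBitmap(frame, 0)

    scanner.process(image)
        .addOnSuccessListener { barcodes ->
            // Hand back the raw value of the first QR code found, if any.
            onResult(barcodes.firstOrNull()?.rawValue)
        }
        .addOnFailureListener { onResult(null) }
}
```

Because the scanning happens on-device, a flow like this keeps working even with the spotty connectivity many of these apps were designed around.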




The other Google resource that many of the Android Dev Challenge winners used was TensorFlow Lite. This powerful machine learning framework can help run machine learning models on Android, iOS and IoT devices that would never normally be able to support them. Its set of tools can be used for all kinds of powerful neural network-related applications, from image detection to speech recognition, bringing the latest cutting-edge technology to the devices we carry around with us wherever we go. Trashly, for instance, uses a custom TensorFlow Lite model to report if an object is recyclable and how to recycle it.
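

As a rough illustration of that pattern, here is a minimal Kotlin sketch of running a custom image-classification model with the TensorFlow Lite Interpreter. The model file name (recyclables.tflite), the label list, and the 224x224 float input are assumptions made for this example, not Trashly’s actual assets:

```kotlin
import android.content.Context
import android.graphics.Bitmap
import org.tensorflow.lite.Interpreter
import java.nio.ByteBuffer
import java.nio.ByteOrder
import java.nio.channels.FileChannel

// Illustrative classifier: model file, labels, and input size are placeholders.
class RecyclableClassifier(context: Context) {

    // Memory-map the bundled .tflite model from assets and hand it to the Interpreter.
    private val interpreter: Interpreter = context.assets.openFd("recyclables.tflite").use { fd ->
        val modelBuffer = fd.createInputStream().channel.map(
            FileChannel.MapMode.READ_ONLY, fd.startOffset, fd.declaredLength
        )
        Interpreter(modelBuffer)
    }

    private val labels = listOf("plastic bottle", "paper cup", "plastic bag", "other")
    private val inputSize = 224  // assumed model input resolution

    fun classify(bitmap: Bitmap): String {
        // Resize the camera frame to the model's expected input resolution.
        val scaled = Bitmap.createScaledBitmap(bitmap, inputSize, inputSize, true)

        // Pack RGB pixels into a float buffer normalized to [0, 1].
        val input = ByteBuffer.allocateDirect(4 * inputSize * inputSize * 3)
            .order(ByteOrder.nativeOrder())
        val pixels = IntArray(inputSize * inputSize)
        scaled.getPixels(pixels, 0, inputSize, 0, 0, inputSize, inputSize)
        for (p in pixels) {
            input.putFloat(((p shr 16) and 0xFF) / 255f)  // R
            input.putFloat(((p shr 8) and 0xFF) / 255f)   // G
            input.putFloat((p and 0xFF) / 255f)           // B
        }
        input.rewind()

        // The model outputs one probability per label; pick the most likely class.
        val output = Array(1) { FloatArray(labels.size) }
        interpreter.run(input, output)
        var best = 0
        for (i in labels.indices) {
            if (output[0][i] > output[0][best]) best = i
        }
        return labels[best]
    }
}
```

The same Interpreter-based structure works whether the model was trained from scratch or produced with transfer learning, which is what makes TensorFlow Lite such a flexible companion to ML Kit’s turnkey APIs.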



Helpful innovation, such as the 10 winning apps in the Android Developer Challenge, has the potential to change the way we access, use, and interpret information, making it available when we need it, where we need it most. By working with these developers focused on helpful innovation, we hope to inspire the next wave of developers to unlock what’s possible with this new technology.



[Image: #11WeeksOfAndroid Week 2, Machine Learning with Android logo]

What’s next in Android Machine Learning week?





As we kick off the second week of #11WeeksOfAndroid, focused on Machine Learning, we will highlight new tools and resources available to Android developers. Here’s a taste of the rest of this week:




  • Tuesday - ML Kit. The turnkey ML SDK went through a major overhaul with its new on-device offering this month. Check out the substantial improvements in developer usability, CameraX support, and where the platform is going next.

  • Wednesday - Custom Models. When a prepackaged SDK doesn’t quite satisfy your needs, tools from Android Studio, TensorFlow Lite, and ML Kit might just be the answer. Aside from the individual offerings, we will also highlight how they can be used together.

  • Thursday - ML design. Learn some best practices for making ML product decisions from the People + AI Guidebook. We will go behind the scenes of the Read Along app, an on-device ML app that helps grow universal literacy. Bring your whole team, because everyone, including UXers, engineers, and product managers, is invited!



On Tuesday and Wednesday, we will also have a “codelab of the day,” so get the Android Studio 4.1 beta today, block off an hour in your schedule, and take this ML journey with us!



*The apps presented here are the projects of the developers individually, and not Google.
