The Official Ionic Blog

Build amazing native and progressive web apps with HTML5


Let Me See, the app that won the Innovation Award at the 2016 Angular Attack hackathon, was built with Ionic 2 in 48 hours by Wassim Chegham, a Developer Advocate at SFEIR in Paris and a Google Developer Expert in Web Technologies, with help from Attila Csanyi and Uri Shaked.

Let Me See’s main purpose is to help people with sight loss “see” what is around them by reading text and analyzing colors out loud.

It’s also a Progressive Web App, which we were pretty psyched about, since we’ve been wanting to find a great PWA example to show all of you!

“Let Me See is an awesome example of a PWA built with Ionic 2,” says Ionic support engineer Justin Willis, who helped judge the hackathon. “It includes everything needed to be a PWA—a service worker and a manifest.json—which means you can use it while offline and add it to your homescreen to get that native feel. My favorite thing about Let Me See is that it shows just how easy it is to build a PWA with Ionic 2 and that it shows a major strength of hybrid apps: the ability to not only run as a native app, but run just as well as a web app.”
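The two pieces Willis lists can be wired up in just a few lines. Here is a minimal sketch of the registration side, assuming the generated worker is served as `service-worker.js` next to `index.html` (the manifest is linked separately via `<link rel="manifest" href="manifest.json">`):

```javascript
// Register the service worker where the browser supports it; the file name
// "service-worker.js" is an assumption. The navigator object is passed in
// so the helper degrades gracefully when service workers are unavailable.
function registerServiceWorker(nav) {
  if (!nav || !('serviceWorker' in nav)) {
    // No offline support, but the app still runs as a normal web page.
    return Promise.resolve(null);
  }
  return nav.serviceWorker.register('service-worker.js');
}

// In the app's entry point: registerServiceWorker(window.navigator);
```

Browsers without service worker support simply skip registration, which is what lets the same code base run everywhere while progressively enhancing where it can.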

Chegham has been using Ionic 1 for several years, whenever his team needed to build robust AngularJS mobile applications for clients or for internal apps. He’d been wanting to try Ionic 2 and thought the hackathon was a perfect opportunity.


“Getting started with Ionic 2—and Ionic 1—is a matter of minutes, if not seconds, thanks to the CLI, which helped us scaffold the app,” says Chegham. “This is crucial for a hackathon; you don’t want to spend your time debugging your build system or messing with the configuration.”

Since the app is voice-driven, its UI stays simple: Chegham used basic Ionic components, like the toolbar and card components.

“We may add more features in the coming versions and may then take advantage of some more advanced UI components, like Action Sheets or Gestures, but we need to think deeply about that, because we have to offer the best UX possible for our target audience,” he says. “The app uses standard HTML5 APIs, such as Navigator.getUserMedia() for capturing the user’s voice, but more importantly, capturing the outside world video for image processing, which is the whole purpose of this app.”
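A sketch of the capture step the quote describes, using the promise-based `navigator.mediaDevices.getUserMedia()` (the prefixed `Navigator.getUserMedia()` callback form mentioned above is deprecated); the constraint values and element wiring here are assumptions, not the app's actual code:

```javascript
// Ask for the user's microphone plus the rear ("environment") camera and
// attach the resulting stream to a <video> element, from which frames can
// later be drawn onto a canvas for image processing.
async function startCapture(videoElement, mediaDevices = navigator.mediaDevices) {
  if (!mediaDevices || !mediaDevices.getUserMedia) {
    throw new Error('getUserMedia is not supported in this browser');
  }
  const stream = await mediaDevices.getUserMedia({
    audio: true,                          // capture the user's voice
    video: { facingMode: 'environment' }  // capture the outside world
  });
  videoElement.srcObject = stream;
  await videoElement.play();
  return stream;
}
```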

The Google Cloud Vision API made image processing possible. Let Me See also uses other graphic APIs, like Canvas and SVG, for some parts of the UI.

“All those Web APIs were so easy to integrate, thanks to Ionic, since it is by design a web-friendly platform,” Chegham says. “Adding support for Progressive Web Apps was really straightforward.”

Chegham wanted the app to be installable and available offline, so he added a manifest.json file with necessary details like the app icon, theme color, name, etc., and a Service Worker for offline support.
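A manifest along the lines described might look like the following; the app name comes from the article, but every path, color, and size below is illustrative, not taken from the actual app:

```json
{
  "name": "Let Me See",
  "short_name": "Let Me See",
  "start_url": "index.html",
  "display": "standalone",
  "theme_color": "#387ef5",
  "background_color": "#ffffff",
  "icons": [
    { "src": "assets/icon/icon-192.png", "sizes": "192x192", "type": "image/png" },
    { "src": "assets/icon/icon-512.png", "sizes": "512x512", "type": "image/png" }
  ]
}
```

The manifest is referenced from `index.html` with `<link rel="manifest" href="manifest.json">`, which is what lets browsers offer the "add to homescreen" install prompt.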

“Besides, Ionic comes with Gulp as a build system and allows you to easily drop in your custom tasks,” he adds. “I found this really convenient because I could add my custom ‘sw-precache’ task to the ‘build:after’ lifecycle, which basically generates the Service Worker after each build and keeps it up to date.”
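A sketch of what such a task could look like, using sw-precache's Node API; the output path, the glob patterns, and the exact ‘build:after’ hook wiring are assumptions about this particular setup:

```javascript
// gulpfile.js (excerpt) — regenerate www/service-worker.js after each build
var gulp = require('gulp');
var swPrecache = require('sw-precache'); // npm install --save-dev sw-precache

gulp.task('sw-precache', function (done) {
  swPrecache.write('www/service-worker.js', {
    // Precache every static asset the app ships, so it loads offline.
    staticFileGlobs: ['www/**/*.{html,js,css,png,svg,json}'],
    stripPrefix: 'www'
  }, done);
});
```

Running this task after every build keeps the generated worker's precache manifest in sync with the latest assets, which is exactly the "keeps it up to date" behavior described in the quote.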

For this first iteration, the team’s MVP was to deliver a working, easy-to-access application, so they focused on the Web platform to make sure it worked as intended. The app is still in the proof of concept stage and not available for production use.

“The innovation of Let Me See and its use of modern web APIs is what first impressed me,” says Willis. “Let Me See shows you what is possible with the modern web and how far it has come from just static web pages.”

If you’ve built a PWA with Ionic 2, we’d love to hear about it! Email us your examples!

  • Himalay

    Where can I try the app?

    • Wassim Chegham

For now, I am hosting the app on Firebase:

      • Holla

Hi Wassim, can you share any details on how you integrated Firebase with offline capabilities in your PWA? The Firebase JS library has no offline storage/fallback.

        • Wassim Chegham

This first iteration doesn’t implement Firebase offline capabilities. Remember, the hackathon was only 48h 🙂
          However, you can look at this doc:

          • karim

            Great work Wassim!!!
            I loved it.
            I actually wanted to do something like this as I have friend who is blind. Glad to see it’s possible 🙂

          • Wassim Chegham

            Thank you Karim.

  • Marcus VBP

Where can I try the app?

  • http://AutomateLife.TV Mike Lallemont

    How did you implement text to speech? Does it work on iOS?

    • Wassim Chegham

I am using which is basically a native HTML5 implementation. The app is still a prototype for now and we only support Chrome.

  • Dustin Jones

Not to rain on the parade, but I’m not sure having an ‘app’ that doesn’t work at all on iOS, for a platform whose sole intention is to enable ‘easy’ cross-platform development, really counts as a big win…

    • Wassim Chegham

The app is using cutting-edge HTML5 APIs which, I’m afraid, are not yet supported on iOS. Also, please keep in mind that the app is still a prototype and we only support Chrome for now.

    • Nkansah Rexford

      Chrome and Android FTW!

  • Prabhash Choudhary

Can you share the source code? Thanks