The app: YogiFi is a smart yoga mat and companion app that delivers personalised yoga programmes and instant therapy sessions anytime, anywhere. Paired with the YogiFi Mat, the app turns into a virtual yoga instructor that provides guidance and real-time feedback on each asana.
Let’s meet: Muralidhar Somisetty, founder and chief executive officer of Wellnesys.com, which is behind YogiFi.
The motivation behind YogiFi Mat
The real motivation behind YogiFi came from my own life experience. I had a severe spine injury at one point, and yoga helped me recover completely without having to go through the recommended surgery. It gave me a strong conviction in this ancient science of healing, which eventually led me to become a certified yoga teacher. I organised free community sessions via both online and studio classes, specifically for senior citizens and kids, and also offered therapy sessions to a few friends living in the US and Singapore. Today, we see an overwhelming amount of yoga content across online and app platforms. While the content may be great, it is neither personalised to an individual’s limitations nor curated based on one’s progress in flexibility and strength. And while one gets better interaction in physical studio classes, most people cannot be regular with practice due to commutes, busy lifestyles and travel schedules.
YogiFi’s smart yoga mat, combined with a smartphone (iOS) app, now helps users practise yoga anytime, anywhere, with real-time assistance on asanas (postures) and breathing. Users are offered personalised wellness programmes based on their goals, limitations and past history, and can gauge the effectiveness of a yoga session through vitals data from Apple Watch. YogiFi’s mission is to make yoga and meditation a daily habit in everyone’s lifestyle through immersive user experiences.
YogiFi received an overwhelming response from across the world after CES 2020 (one of the world’s biggest annual tech trade shows), thanks to the Innovation Award honour and media attention. We observed accelerated interest in the YogiFi product and app during the lockdown, with product queries and app downloads coming from India and across the globe, most notably the US, Canada, the UK, France, Italy, South Korea and Japan. We have also received requests to support voice instructions in multiple languages within the app, which is planned in the roadmap for a winter release.
The app: Prayoga is a brand new way of learning yoga. An iOS-exclusive app, it uses body tracking technology along with machine learning to understand how you are performing a pose and gives you real-time feedback to reach the perfect pose.
Let’s meet: Krishnaprasad Jagadish, co-founder of Parjanya Creative
When Apple introduced body tracking with ARKit 3 at WWDC 2019, the immense potential of this technology on a mobile device used by millions of people had our cogs turning. When we realised that the static nature of yoga poses would fit perfectly for evaluation through body tracking, we quickly put together a proof of concept to test our idea, and to our delight, it worked better than we expected.
Apple’s ARKit body tracking can track 91 joints in the human body. Knowing that the effectiveness of one’s yoga practice lies not just in the movement but in holding the final pose correctly, we felt that body tracking, and hence feedback about the joints, would help the user achieve that perfect pose. In the initial days, we experimented with a particular asana called Vrikshasana, or ‘tree pose’.
When we processed this using ARKit’s body tracking, we were astonished at the accuracy. We consulted experts in the field, including doctorate holders in yoga and several yoga studio owners and trainers, to determine the exact joints that would need to be examined to effectively evaluate an asana. With this information in hand, we proceeded to create a ‘master pose’ with all the joints in the correct formation; any user is then evaluated in real time against this ‘master pose’. The app continuously monitors the user through the device’s back camera and, in real time, evaluates and provides feedback at each significant step of the asana.
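The idea of comparing a user against a ‘master pose’ can be sketched as follows. This is a minimal illustration of the principle, not Prayoga’s actual code: the joint names, target angles and tolerance are assumptions, and in the real app ARKit reports 3D joint transforms in Swift rather than 2D points.

```python
import math

def angle(a, b, c):
    """Angle at joint b (degrees) formed by points a-b-c in 2D."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

# Hypothetical 'master pose' for Vrikshasana (tree pose):
# target angle for each joint the experts said should be examined.
MASTER_POSE = {"right_knee": 45.0, "left_knee": 180.0}
TOLERANCE = 15.0  # degrees of deviation allowed per joint

def evaluate(joint_angles):
    """Compare the user's joint angles to the master pose, joint by joint."""
    feedback = {}
    for joint, target in MASTER_POSE.items():
        deviation = abs(joint_angles[joint] - target)
        feedback[joint] = "ok" if deviation <= TOLERANCE else "adjust"
    return feedback
```

In use, the joint angles would be recomputed from the tracked joint positions on every camera frame, so `evaluate` runs continuously and the “adjust” joints drive the real-time feedback.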
Creating the 3D model
3D model creation can be approached using several techniques, such as photogrammetry, laser scanning or hand modelling. After initial research into the cost-effectiveness and quality of each method, we settled on a laser scanner. To capture the correct depth information for a 3D model, the process involves placing the subject in the middle of the room and scanning all around it in a semi-spherical path.
We started with the Structure Sensor by Occipital, a small but effective laser scanner mounted on an iPad. We built a custom rig that would move around the trainer to create a laser scan, and hence a 3D model, of the trainer.
Beyond the challenge of building a rig that could carry out the laser scanning, our trainers had to hold the most difficult of poses for two to three minutes at a time while the rig moved around them and completed a full circle. Imagine having to hold an asana like this one for three minutes straight!
Once we had the laser scan of our trainer performing the asana, we cleaned up the model and applied textures of clothing and colours captured through the sensor and a digital camera. The prepared models were then brought into Apple’s Reality Composer to add behaviours and titles, and exported as USDZ files into our project. This tool from Apple really sped up the process of bringing a 3D model from our game engine into our iOS application.
The app: A holistic wellness mobile app built on the principles of the ancient Indian Ayurvedic method of pulse diagnosis, assessing an individual’s strength, metabolism and emotional state.
Let’s meet: Abhilesh Gupta, co-founder and CEO of AyuRythm
The technology used
Even though holistic wellness methods like Ayurveda are well developed and offer personalised solutions, they still require an expert to assess the state of the individual and offer customised advice. One of the challenges of scaling this was the lack of a technology-driven, objective measure of an individual’s holistic wellness state, which varies from morning to night, season to season and day to day. AyuRythm solves this with an easy-to-use, accessible, digital mobile platform. Registered users answer 30 questions once and take a 30-second pulse diagnostic test using their smartphone camera. The user’s holistic wellness state is analysed, and solutions combining yoga, breathing exercises, meditation, food and herbal remedies are suggested. The more tests a user takes, the better the system can track their holistic wellness variation from day to day and season to season, and the more personalised the solutions it can offer.
The underlying principle of the pulse diagnostic test is to measure blood flow using the smartphone camera, based on finger photoplethysmography (PPG) principles. From the signal, we compute seven vital signs, namely heart rate, rhythm, systolic BP, diastolic BP, pulse pressure, arterial stiffness and pulse morphology, and correlate these vital signs with Ayurvedic biomarkers. These are then used to calculate the bodily humours, namely kapha, pitta and vata, which form the holistic wellness index.
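The heart-rate half of that pipeline can be sketched in a few lines. This is a minimal illustration of the PPG principle only, assuming a clean one-dimensional brightness signal and a simple local-maximum peak detector; the synthetic signal, sampling rate and thresholding are assumptions, not AyuRythm’s actual processing.

```python
import math

def heart_rate_bpm(signal, fps):
    """Estimate heart rate from a 1-D PPG signal sampled at `fps` frames/s.

    Brightness of the fingertip varies with each pulse of blood flow;
    heart rate follows from the interval between successive pulse peaks.
    """
    mean = sum(signal) / len(signal)
    peaks = []
    for i in range(1, len(signal) - 1):
        # A peak: a local maximum that rises above the signal mean.
        if signal[i] > mean and signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]:
            peaks.append(i)
    if len(peaks) < 2:
        return None  # not enough beats to estimate a rate
    intervals = [(b - a) / fps for a, b in zip(peaks, peaks[1:])]
    avg_interval = sum(intervals) / len(intervals)  # seconds per beat
    return 60.0 / avg_interval

# Synthetic 30-second test signal: a 72 bpm pulse sampled at 30 frames/s.
fps, bpm = 30, 72
signal = [math.sin(2 * math.pi * (bpm / 60) * (i / fps)) for i in range(30 * fps)]
```

A real implementation would first band-pass filter the camera signal and reject motion artefacts before peak detection, and the remaining vitals (blood pressure, arterial stiffness, pulse morphology) require far more sophisticated models than this sketch.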
Advantage iOS offers
- Uniformity and a consistent user experience: the Apple world is unfragmented. Our USP depends on the quality of the smartphone camera signal, and Apple offers consistent quality across its devices and OS versions.
- A very large worldwide user base interested in holistic wellness through Ayurveda and yoga.
- A mobile health ecosystem driven by HealthKit, CareKit and ResearchKit. HealthKit allows integration of activity, stress, sleep cycle and other health parameters, which can help improve the Ayurvedic assessment.
- Apple’s peripheral ecosystem, consisting of Apple Watch and AirPods, which can help to get quality vital signs data, is critical for AyuRythm’s advanced user experience.
Importance of peripherals
- For best results, the pulse diagnostic test should be completed early in the morning on an empty stomach, and, to capture day-to-day and seasonal variation, taken daily or as often as possible. Apple Watch can improve this experience, as it can be programmed to take these measurements automatically. Once Apple allows integration with AirPods in the future, we should be able to offer our users a seamless experience.
- The Ayurvedic circadian rhythm is a very important concept for offering advice. Wearables can therefore support a more personalised daily routine, since they allow continuous monitoring without users having to remember to take the test.
- Being less prone to motion artefacts and ambient temperature variation, the ear can provide excellent, low-noise signals via AirPods, further easing diagnosis without users having to remember to take the test on the phone. This will be made available once Apple opens up AirPods for integration.