Rohit Krishna’s Post


General Partner at WEH Ventures | Pre-seed & Seed

This Apple keynote was special, even more so than when the Vision Pro was launched. 'Apple Intelligence' has definitely brought LLMs mainstream, or, as they call it, 'AI for the rest of us.' What Apple did smartly was, instead of talking about the technology itself, devote almost half the presentation to smart experiences that weren't possible before. Of course, it started out with jokes like 'moving app icons on the home screen', but it escalated to parts where you're pleasantly surprised at how you might not be able to live without some of these features very soon.

With the App Intents API, other developers can tap this 'personal context' in their own products (and most probably expose their own app data to Apple Intelligence in return), which makes iOS much smarter and cements you into the ecosystem more than ever before (a rough sketch of what an app intent looks like is at the end of this post).

After watching this keynote, it's quite clear which large tech company handled the LLM revolution the best: Apple > Microsoft > Google. Microsoft probably has almost all the capabilities Apple has (or even more), but its pitch just wasn't as consumer-focused. Google, of course, is asleep at the wheel.

One thing is clear: apps with LLM/GenAI capabilities are not just hype, they are going to be table stakes henceforth. But we've been trying to answer the difficult question of what the moat is. Folks first thought it was the LLM itself, and that companies like OpenAI with the largest models had it. I have always been a believer in the power of distribution: just because it's cool tech, people are not going to rush to adopt it unless it's useful for them. Apple did just that, and the way they handled ChatGPT in the keynote gives more subtle cues. ChatGPT was spoken about for two minutes in the whole keynote, and only towards the very end. But the icing on the cake was this: "We also intend to add support to other AI models in the future." Just as Google Search had to pay the Apple tax for distribution, I think ChatGPT soon will too.

Will write a longer note on where we believe the opportunities are for early-stage startups in this space.

P.S. Could someone please explain Private Cloud Compute? Clearly not all the compute is happening on-device, so how is it more private? They cleverly kept showing the Apple logo with the lock multiple times without actually explaining how it works.

#Applekeynote #LLM #ChatGPT #venturecapital
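For anyone curious what plugging into that looks like for a developer, here's a minimal sketch of an app intent using Apple's AppIntents framework. The names (CreateNoteIntent, NoteStore) are hypothetical, and this only shows the shape of the API, not how Apple Intelligence consumes it under the hood.

```swift
import AppIntents

// Hypothetical note-taking app exposing one action to the system.
struct CreateNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Create Note"
    static var description = IntentDescription("Creates a note with the given text.")

    // The system can fill this parameter from user input or context the user has allowed.
    @Parameter(title: "Text")
    var text: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-local logic: the intent runs inside the app's process,
        // so the app decides what data it exposes back to the system.
        NoteStore.shared.add(text)
        return .result(dialog: "Saved your note.")
    }
}

// Minimal stand-in store so the sketch is self-contained.
final class NoteStore {
    static let shared = NoteStore()
    private(set) var notes: [String] = []
    func add(_ text: String) { notes.append(text) }
}
```

The idea is that each exposed action is a small, typed unit the OS can discover and invoke, which is presumably the channel through which 'personal context' flows both ways.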

Debapriyo Mandal

Founder MBAcupid, IIM NIT

3w

I have a feeling Apple must be paying an AI tax to OpenAI, contrary to Google paying the Apple tax.

Mayank Chawla

Building a Deeptech and EdgeAI platform that helps with the complex tasks of building, compiling, and porting deep learning applications to edge devices. #GenerativeAI, #Deeptech, #EdgeAI, #ML, #ComputerVision, #LLMs

3w

Local ML is going to be big due to data privacy and real-time processing. On-device model inference is the most secure form of ML, but building an entire application on hardware/edge is very challenging and time-consuming. Now we need foundational models designed and developed for edge devices.
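To make the on-device point concrete, here's a minimal sketch of local inference with Core ML. The model bundle name ("SmallLanguageModel") and the feature names ("prompt", "completion") are hypothetical; the point is that both the weights and the user's input stay on the device.

```swift
import CoreML

enum InferenceError: Error { case modelMissing }

// Minimal sketch: load a compiled Core ML model shipped with the app and run it locally.
func runLocalInference(prompt: String) throws -> String {
    guard let url = Bundle.main.url(forResource: "SmallLanguageModel", withExtension: "mlmodelc") else {
        throw InferenceError.modelMissing
    }

    let config = MLModelConfiguration()
    config.computeUnits = .all   // let Core ML schedule across CPU, GPU, and Neural Engine
    let model = try MLModel(contentsOf: url, configuration: config)

    // Build the input and run inference entirely on-device; nothing leaves the phone.
    let input = try MLDictionaryFeatureProvider(dictionary: ["prompt": prompt])
    let output = try model.prediction(from: input)
    return output.featureValue(for: "completion")?.stringValue ?? ""
}
```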


Private, secure computing on the edge is here. It could be even more secure thanks to security features embedded below the OS, like advanced telemetry, auto-remediation, recovery, and more.


