• 4 Posts
  • 92 Comments
Joined 1 year ago
Cake day: June 4th, 2023

  • You still need a massive fleet of these to train those multi-billion parameter models.

    On the inference side, a cloud SaaS service like ChatGPT, Anthropic’s hosted API, or AWS Bedrock can answer queries quickly, but they cost a lot to operate at scale. I have a feeling the bean-counters are going to slow down the crazy overspending.

    We’re heading into a world where edge computing is more cost- and energy-efficient to operate, and more privacy-friendly too. I’m more enthused about running these models on our phones and in-home devices. There, the race will be TOPS versus power draw (rough numbers below).
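
    To put rough numbers on that (every figure here is a made-up placeholder, just to show the shape of the math, not a measurement of any real chip): the figure of merit is TOPS per watt, and from that you can estimate joules per generated token.

    ```c
    /* Back-of-envelope energy math: TOPS per watt and joules per token.
     * Every number here is an assumed placeholder, not a measured spec. */
    #include <stdio.h>

    int main(void) {
        double phone_tops = 35.0,   phone_watts = 8.0;   /* assumed phone/home NPU   */
        double dc_tops    = 2000.0, dc_watts    = 700.0; /* assumed datacenter GPU   */
        double tera_ops_per_token = 0.2;                 /* assumed small-model cost */

        /* Efficiency figure of merit, plus energy for one generated token:
         * seconds/token = ops / throughput, joules = watts * seconds.      */
        printf("phone NPU : %4.1f TOPS/W, %.4f J/token\n",
               phone_tops / phone_watts, phone_watts * tera_ops_per_token / phone_tops);
        printf("datacenter: %4.1f TOPS/W, %.4f J/token\n",
               dc_tops / dc_watts, dc_watts * tera_ops_per_token / dc_tops);
        return 0;
    }
    ```

    With those (made-up) numbers the phone NPU comes out ahead per token, and that’s before counting datacenter cooling and networking overhead, which is the whole point of pushing inference to the edge.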





  • That is actually kind of brilliant. Having to go through MFi and getting the Apple DRM chip into the manufacturing pipeline can be a real pain (and expensive).

    With this scheme, they could also run all the wired on/off and volume-control actions through Bluetooth AVRCP. They could even put a mic on the wire, so if a call comes in, the accessory switches to HFP to handle the call (rough sketch below).
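
    For the curious, here’s roughly what the accessory-side mapping could look like. The operation IDs are the standard AV/C passthrough codes that AVRCP reuses; bt_avrcp_send_passthrough() is a made-up stand-in (stubbed so it compiles) for whatever the accessory’s real Bluetooth stack exposes.

    ```c
    /* Rough sketch: wired button presses translated into AVRCP passthrough
     * commands sent over Bluetooth instead of through an MFi chip.         */
    #include <stdio.h>
    #include <stdint.h>
    #include <stdbool.h>

    /* AV/C passthrough operation IDs (as used by AVRCP) */
    enum {
        AVC_OP_VOLUME_UP   = 0x41,
        AVC_OP_VOLUME_DOWN = 0x42,
        AVC_OP_PLAY        = 0x44,
        AVC_OP_PAUSE       = 0x46,
        AVC_OP_FORWARD     = 0x4B,   /* next track     */
        AVC_OP_BACKWARD    = 0x4C    /* previous track */
    };

    typedef enum { BTN_CENTER, BTN_VOL_UP, BTN_VOL_DOWN, BTN_NEXT, BTN_PREV } button_t;

    /* Hypothetical stand-in for the stack's real passthrough call. */
    static void bt_avrcp_send_passthrough(uint8_t op_id) {
        printf("AVRCP passthrough 0x%02X (press+release)\n", op_id);
    }

    /* Map a wired button event to the matching AVRCP command. */
    static void handle_button(button_t b, bool playing) {
        switch (b) {
        case BTN_CENTER:   bt_avrcp_send_passthrough(playing ? AVC_OP_PAUSE : AVC_OP_PLAY); break;
        case BTN_VOL_UP:   bt_avrcp_send_passthrough(AVC_OP_VOLUME_UP);   break;
        case BTN_VOL_DOWN: bt_avrcp_send_passthrough(AVC_OP_VOLUME_DOWN); break;
        case BTN_NEXT:     bt_avrcp_send_passthrough(AVC_OP_FORWARD);     break;
        case BTN_PREV:     bt_avrcp_send_passthrough(AVC_OP_BACKWARD);    break;
        }
    }

    int main(void) {
        handle_button(BTN_VOL_UP, true);   /* wired volume press -> AVRCP 0x41   */
        handle_button(BTN_CENTER, true);   /* center click while playing -> pause */
        return 0;
    }
    ```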

    Damn, that’s clever. Hats off to whoever came up with it.

    Incidentally, there’s very little Apple can do to stop this, unless they decide to break Bluetooth for third-party accessories across the board.