CoreDump: Wyze Car to Class …
Gecko: In this episode we move far beyond Camel, Mule, and SOA to Kafka, designing SaaS-based persistence and data mining.
Here is Kafka at Netflix.
Lisa: I am the official Wyze CarPool driver today. I remote-drive the Wyze Car to class, with the rest of the pool as passengers, and we 'spy' on graduate classes. We lied about our age; we are not supposed to be seen around these classes. Too young, I guess?
Gecko: Never too young for TensorLy! I wonder if you made enemies at AMD, Intel, and the other chip firms. They market to bad programmers who forever pretend to know the math but write poor ML code, selling ever faster machines for crunching tensor expressions that could simply be simplified if the coders were math-literate. No doubt Intel sells; ignorance is bliss.
Lisa: I guess all 'good' firms lose out! In any case, is there a way to stream Wyze Cam events through reactive Kafka to AWS? A Lambda to upload every event's video stream to a Kafka topic, then persist it to DynamoDB, add a natural-language (NLDB) search, and gather notes, summaries, main points, and a knowledge graph? Get Wyzer every day?
I envision programmers turning the wheel of technology, marketing Moore's law: Gandhian coders, pacifists.
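Lisa's pipeline could be sketched in plain Java before wiring in real clients. Everything here is an assumption for illustration: the event fields, the topic name `wyze-cam-events`, and the DynamoDB key names; the actual Kafka and DynamoDB calls are left as comments.

```java
import java.time.Instant;
import java.util.Map;

// Hypothetical shape of one Wyze Cam motion event; field names are assumptions.
record CamEvent(String cameraId, Instant timestamp, String videoUrl) {

    // Kafka topic the upload Lambda would publish to (name is an assumption).
    String topic() {
        return "wyze-cam-events";
    }

    // Item a downstream consumer would persist to DynamoDB.
    Map<String, String> toDynamoItem() {
        return Map.of(
                "pk", cameraId,             // partition key: camera id
                "sk", timestamp.toString(), // sort key: event time
                "videoUrl", videoUrl);
    }
}

public class WyzePipelineSketch {
    public static void main(String[] args) {
        CamEvent e = new CamEvent("porch-cam",
                Instant.parse("2023-05-01T10:15:30Z"),
                "https://example.com/clip.mp4");
        // A real handler would do something like:
        // producer.send(new ProducerRecord<>(e.topic(), e.cameraId(), jsonBytes));
        // dynamo.putItem(r -> r.tableName("wyze-events").item(...));
        System.out.println(e.topic() + " -> " + e.toDynamoItem().get("pk"));
    }
}
```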
Gecko: I just googled the topic and found this architecture;
I web-scraped this image….
It shows a simple Java system that does more or less what you want with the Wyze Cam:
a Lambda to upload video streams from the Wyze Cam, one to produce to the topics, and Fargate to persist. You would need to add another Lambda, with SageMaker, to data-mine, and maybe add video summarization and automated note-taking. We could modify the Lambdas to stream all Wyze cloud recordings by using gists from GitHub.
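A minimal producer configuration for such an upload Lambda might look like the following sketch. The broker address is a placeholder, and the serializer classes are the standard ones that ship with kafka-clients; the commented-out send shows where the real call would go.

```java
import java.util.Properties;

// Sketch of the Kafka producer settings an upload Lambda could use.
public class ProducerConfigSketch {
    static Properties producerProps(String bootstrap) {
        Properties p = new Properties();
        p.put("bootstrap.servers", bootstrap); // placeholder broker address
        p.put("key.serializer",
              "org.apache.kafka.common.serialization.StringSerializer");
        p.put("value.serializer",
              "org.apache.kafka.common.serialization.ByteArraySerializer");
        p.put("acks", "all"); // wait for full replica ack before the Lambda returns
        return p;
    }

    public static void main(String[] args) {
        Properties p = producerProps("broker.example.com:9092");
        // In the real Lambda:
        // try (var producer = new KafkaProducer<String, byte[]>(p)) {
        //     producer.send(new ProducerRecord<>("wyze-cam-events", camId, clipBytes));
        // }
        System.out.println(p.getProperty("acks"));
    }
}
```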
Lisa: On Azure there is Event Hubs, plus Kafka and Azure MLOps and APIOps, all from Visual Studio with GitHub Copilot: easy coding. Event Hubs, analogous to AWS, gives you an architecture for data lakes of Wyze classes, with filters for what you want.
Ref: Event Hubs
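Event Hubs exposes a Kafka-compatible endpoint on port 9093, so the same Kafka producer can point at Azure by swapping configuration. A sketch, with the namespace name and connection string as placeholders:

```java
import java.util.Properties;

// Kafka client settings for the Event Hubs Kafka-compatible endpoint:
// port 9093, SASL_SSL, PLAIN mechanism, namespace connection string as password.
public class EventHubsKafkaConfig {
    static Properties eventHubsProps(String namespace, String connectionString) {
        Properties p = new Properties();
        p.put("bootstrap.servers", namespace + ".servicebus.windows.net:9093");
        p.put("security.protocol", "SASL_SSL");
        p.put("sasl.mechanism", "PLAIN");
        p.put("sasl.jaas.config",
              "org.apache.kafka.common.security.plain.PlainLoginModule required "
              + "username=\"$ConnectionString\" password=\"" + connectionString + "\";");
        return p;
    }

    public static void main(String[] args) {
        // "wyze-lake" and the connection string are placeholders.
        Properties p = eventHubsProps("wyze-lake", "<connection-string>");
        System.out.println(p.getProperty("bootstrap.servers"));
    }
}
```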
We can then add Azure Functions to data-mine the lake. For example, I could create a reactive stream from the Kafka topic and look for instances of my name, with a summary of the relevant topics.
Gecko: That is straightforward with Azure: you have speech-to-text APIs, summarization APIs, and a straightforward grep-like search filter for your name. A context summarizer?
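The grep-like filter is easy to sketch in plain Java over transcript lines, assuming speech-to-text has already run upstream; the sample transcript lines are invented for illustration.

```java
import java.util.List;
import java.util.stream.Collectors;

// Keep every transcript line that mentions a given name, case-insensitively.
// A real Function would feed these matches to a summarization API afterwards.
public class NameFilterSketch {
    static List<String> mentions(List<String> lines, String name) {
        String needle = name.toLowerCase();
        return lines.stream()
                .filter(l -> l.toLowerCase().contains(needle))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> transcript = List.of(
                "Today we cover tensor decompositions.",
                "Lisa asked about rank selection.",
                "Homework is due Friday.");
        System.out.println(mentions(transcript, "lisa"));
    }
}
```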