1) Your website, and the dialup sounds, might be my favorite thing about all of this. I also like the cowboy hat.

2) Maybe it's just degrading under load, but I didn't think either chat experience was very good. Both avatars interrupted themselves a lot, and the chat felt more like a jumbled mess of half-thoughts than anything.

3) The image recognition is pretty good though, when I could get one of the avatars to slow down long enough to identify something I was holding.

Anyway, great progress, and thanks for sharing so much detail about the specific hurdles you've faced. I'm sure it'll get much better.



Glad you liked the website, it was such a fun project. We're getting the hug of death from HN, so that might be why you're getting a worse experience. Please try again :)


It was disabled yesterday due to the high traffic, but I was able to connect today. After I said hello and asked a question, the chat immediately kicked me off, so unfortunately I haven't been able to test it for more than a few seconds beyond the "Hello, how can I help you today?"

One thing I've noticed with a lot of these AI video agents, including Meta's teaser for its virtual agents and demos from some other companies, is that they seem to love to move their heads constantly. It makes them all a bit uncanny, like a video game NPC that reacts with a head movement on every utterance. It's less apparent in short 5-10s video clips, but the longer the clip, the more the constant head movements give it away.

I'm assuming this is a well-known and tough problem that's being worked on, since swinging too far in the other direction toward stiff, minimal head movements would make it even more uncanny. I'd love to hear what has been done to tackle the problem, or whether at this point it's an accepted "tell" that lets you know you're speaking with a virtual agent?


Please try it again when you get the chance! We were dealing with high load the past few days and are good to go again! Re: head movements, I totally agree; natural head movements are really important to making the avatar feel believable. The issue today is controllability, which is something we're working on as well!


Tried again today; latency seemed a little better, but there's still a lot of the avatar interrupting himself to change thoughts.

I'm still most impressed by the image recognition: it could clearly read even tiny or partially obscured print on products I held up and name them accordingly. Curious how you're achieving that level of fidelity without sacrificing throughput.


Just tried this. Most amazing thing I've ever seen. Utterly incredible that this is where we're at.



