This is fascinating technology. But even as an early adopter, this UX is a bit too complex and overwhelming for me.
@gregbarbosa putting stickers on objects, drawing custom workflows on the app. It all seems very DIY and unintuitive. I'm sure this is just early days for the tech, but I don't expect any significant adoption on a consumer level at this point. This type of tech is already being used in experiential marketing, but it is more as a gimmick than everyday functionality like these MIT guys are showcasing.
@rosssheingold This comes out of MIT Labs. I think it's early days as well, but anyone who wants to program the objects around them won't have the same mindset as people who only use their iPhone to make calls. That said, I can also imagine generating and delivering complex automation applications to that kind of user.
@rosssheingold It would seem the target audience is less early adopters / product enthusiasts and more hardware hackers (Arduino, Raspberry Pi, etc.).
@rosssheingold I completely agree. Fascinating, but cumbersome right now. Sometimes this is inevitable due to the tech choices as well.
@thiojoe @acelotfi I'm trying to figure out how this gets access to all the systems. Bluetooth? It's not very clear, but it's definitely a game changer.
This is like programmable devices used for computer input for things like music software or live performance by DJs. Stuff like https://www.producthunt.com/tech... or the FingerWorks iGesture pad (tech Apple bought to fuel gesture tracking on iOS devices).
If this kind of customization took off, hacking existing electronics would be a temporary burden. Manufacturers could make touchscreens to display labels and UI, combined with physical input mechanisms. Your car would come with some defaults, but power users could reprogram them for efficiency and to enable new functionality. Why not just have touchscreens? Because it's easier to turn a knob or press a physical button without looking while driving (and the tactile feedback is satisfying).
There's lots of opportunity for simplifying the programming. You could have a natural language library of voice commands. "When I do this [push button], show the subwoofer settings on the center console screen..."
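To make the idea above concrete, here's a minimal sketch of binding a physical input event to a UI action with a small, declarative rule table. The names (`Rule`, `dispatch`, the example triggers) are invented for illustration and aren't part of Reality Editor or any real API:

```python
# Hypothetical sketch: a voice/physical command could define a rule like
# "When I do this [push button], show the subwoofer settings..." which
# boils down to a trigger -> action binding looked up at event time.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Rule:
    trigger: str   # physical input event, e.g. "push button 3"
    action: str    # UI action to perform, e.g. "show subwoofer settings"

# Rules a user might have created via natural-language commands (assumed data).
RULES: List[Rule] = [
    Rule("push button 3", "show subwoofer settings on center console"),
    Rule("turn knob left", "dim dashboard backlight"),
]

def dispatch(event: str) -> Optional[str]:
    """Return the action bound to a physical input event, if any."""
    for rule in RULES:
        if rule.trigger == event:
            return rule.action
    return None

print(dispatch("push button 3"))
# -> show subwoofer settings on center console
```

A real system would parse the spoken command into the trigger/action pair and listen for hardware events, but the core mapping could stay this simple.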
What I find most interesting isn't this product and its inevitable success as a technology (not necessarily a specific company), but how it will affect our daily lives. The possibilities of this tool will enable **everyone who makes anything** to completely reimagine their UI.
So exciting! 😆
Wowowowowowowow!!!!!!! I really haven't been this excited about tech in a long time. It would be great, though, if there were another way to program the functions without painting the black-and-white grids.
Wow. This is super interesting. The description from Reality Editor's website says "This all sounds like science fiction, but it’s not. We’ve made some of our research publicly available. You can download the Reality Editor in the App Store® and use our open source platform called Open Hybrid to build a new generation of Hybrid Objects. This vision is not only for the DIY designer and engineers, but is also fully feasible for the next generation of high-tech users."