neural networks feel a little too "brute-force" to me.
At least all the ones shilled on youtube ted talk tier shit.
I know there are convolutions, but have we tried morphing the topology of the network dynamically in any way?
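for what it's worth, "morphing the topology dynamically" does exist in the literature (NEAT grows networks, and sparse-evolutionary-training-style methods prune and regrow connections during training). here's a minimal numpy sketch of the prune-and-regrow idea; the sizes, fractions, and the `prune_and_regrow` helper are all made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical sparse 8x8 weight matrix (about half the connections alive)
W = rng.normal(size=(8, 8)) * (rng.random((8, 8)) < 0.5)

def prune_and_regrow(W, frac=0.2, rng=rng):
    """Rewire the topology: zero out the weakest live connections,
    then activate the same number of random dead ones (SET-style,
    heavily simplified)."""
    W = W.copy()
    nz = np.flatnonzero(W)
    k = int(frac * nz.size)
    # prune: kill the k smallest-magnitude live weights
    drop = nz[np.argsort(np.abs(W.ravel()[nz]))[:k]]
    W.ravel()[drop] = 0.0
    # regrow: bring k random dead connections to life with small weights
    dead = np.flatnonzero(W.ravel() == 0.0)
    grow = rng.choice(dead, size=k, replace=False)
    W.ravel()[grow] = rng.normal(scale=0.01, size=k)
    return W

W2 = prune_and_regrow(W)  # same density, different wiring
```

the connection count stays fixed while the graph structure drifts toward wherever the strong weights are, which is about as "dynamic topology" as current methods get.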
>>9137768
is there anything else in machine learning?
like... any new tricks?
any old theories to dig up from the 60s?
like... I'm already doing stuff with chaos theory, distributed processing, and blockchain.
home-cooked,
but like, is there anything else?
clever programming, specialized hardware, and numerical-analysis tricks are what's new. Everything else is stuff from the 80s that didn't work back then because the computers were shit. By the way, in the next year or two there will be another computing revolution where memristors get used for machine learning hardware. That should improve ML at least 10-fold compared to today.
>>9137797
yes, because we can dynamically tune the strength of the connections. yes... it will be interesting. we don't actually have to process numbers; we just send the input in and let thermodynamics do all the work... it is very nice.
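the "let physics do the work" bit is concrete: in a memristor crossbar the weights are stored as conductances, you apply input voltages to the columns, and Ohm's law plus Kirchhoff's current law make each row wire sum up a dot product for free. a toy numerical sketch (the conductance and voltage values are made up):

```python
import numpy as np

# hypothetical 3x2 crossbar: G[i][j] is the conductance (siemens)
# programmed into the memristor at row i, column j
G = np.array([[1.0e-3, 2.0e-3],
              [0.5e-3, 1.5e-3],
              [2.0e-3, 0.5e-3]])

# input voltages applied to the columns (volts)
V = np.array([0.2, 0.1])

# Ohm's law per device (I = G*V), Kirchhoff summing along each row:
# the crossbar physically computes a matrix-vector product
I = G @ V  # amps; → [4.0e-4, 2.5e-4, 4.5e-4]
```

a digital chip has to do each multiply-accumulate explicitly; the crossbar does them all at once in the analog domain, which is where the claimed speedup comes from.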
There are also recurrences, and different activation styles... I imagine there's more, but I haven't actually gotten any further into it than that myself... Maybe everyone is hopelessly boring.
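both of those fit in a few lines. a recurrence just feeds the hidden state back into itself each step, and the "activation style" is whatever nonlinearity you squash with. a toy sketch, with made-up sizes and random weights:

```python
import numpy as np

rng = np.random.default_rng(1)

# hypothetical sizes: 4 hidden units, 3 inputs
W = rng.normal(scale=0.5, size=(4, 4))  # hidden-to-hidden (the recurrence)
U = rng.normal(scale=0.5, size=(4, 3))  # input-to-hidden

def step(h, x, act=np.tanh):
    # one recurrent update: new state depends on old state and input
    return act(W @ h + U @ x)

xs = rng.normal(size=(5, 3))  # a length-5 input sequence

h = np.zeros(4)
for x in xs:
    h = step(h, x)  # tanh activation: state stays in (-1, 1)

relu = lambda z: np.maximum(z, 0.0)
h2 = np.zeros(4)
for x in xs:
    h2 = step(h2, x, act=relu)  # same wiring, relu style: state stays >= 0
```

same topology, and the choice of activation alone changes what the state space looks like.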
>>9137768
Echolocation Probability Distribution Tables.