[Boards: 3 / a / aco / adv / an / asp / b / bant / biz / c / can / cgl / ck / cm / co / cock / d / diy / e / fa / fap / fit / fitlit / g / gd / gif / h / hc / his / hm / hr / i / ic / int / jp / k / lgbt / lit / m / mlp / mlpol / mo / mtv / mu / n / news / o / out / outsoc / p / po / pol / qa / qst / r / r9k / s / s4s / sci / soc / sp / spa / t / tg / toy / trash / trv / tv / u / v / vg / vint / vip / vp / vr / w / wg / wsg / wsr / x / y ] [Search | Free Show | Home]



Thread replies: 148
Thread images: 12

File: Tim_machine_learning_cover-01.png (209KB, 1000x835px)
Deep learning is literally exploding these days.
Is anon working on anything deep learning related?
>>
>>57897345
No, and I don't plan to.
After all, it's overrated statistics.
>>
>>57897345
Deep learning is a meme
Prepare your brainus for trustless asynchronous collectively trained intelligence triage assisted learning networks. TACTICAL Nets for short.
>>
File: 1.png (123KB, 500x485px)
>>57897467
>>
File: 1459008283680.jpg (329KB, 1600x1200px)
>>57897345
i was literally working on something related to openai's new universe, but now that i've seen my project (kind of) finished already, i've lost all motivation.


It's not fair
>>
>>57897580
tell us more anon
>>
I'm currently working with Image Classification with CNN's at my job.

Although it may look like a CS subject (like most machine learning subjects do), it has nothing to do with CS; as anon said in >>57897354, it's mostly just computational statistics.

But still, it's pretty fun.
>>
>>57897609
there's nothing more to tell, since i was nowhere near even finishing the "prototype" (cringy word, i know)


To be really fucking honest, this industry is almost impossible on your own: you either have to be a genius (which i'm not) or have a lot of fucking resources, like the ones at OpenAI have.

Not to mention that to get the best formula there has to be a lot of research done in biology & many other fields, which also takes resources many don't have.
>>
>>57897641
You can make yourself a genius at the cost of risking insanity by taking drugs that make your neural connectivity go wild.
>>
>>57897517
I'm flattered.

Though in all seriousness I think deep learning is going to give way to networking multiple component AIs together into collective AIs.
>>
>>57897345
>literally
You literally use that word incorrectly
>>
>>57897345
>No
>>>57896765

I am tho. I'm making a universal bot for the OpenAI universe.

>>57897354
>>57897640
It's not statistics you idiots
>>
>>57897748
Go read the literature, you idiot.
Most of machine learning is based on nonparametric statistics from the 60s/70s/80s, especially Bayesian methods.

Yep, the majority of machine learning has nothing to do with a machine actually learning to do something; it's just a fancy name for "prediction", which has been around for ages.
It does have some areas influenced by old-school AI, like reinforcement learning and the stuff from OpenAI, but still.
>>
File: confus.png (424KB, 420x420px)
Whats deep learning redpill me
>>
>>57897843

Training the machine to pick certain options/decisions based on statistics after feeding it tons of data which it sorted out. Fucking normies think it's some kind of insane sentient machine revolution happening before their eyes though.
>>
File: 161201__ppgn_teaser.jpg (632KB, 1374x1492px)
Prepare for generative ants networks.

http://www.evolvingai.org/ppgn
>>
>>57897821
You could just as well say that Quantum mechanics is just linear algebra then. Or that CS is just overrated addition or logic. Chemistry is just overrated Physics, etc.
>>
>>57897653
this isn't an anime
>>
>>57897821
human brain is also just statistics
>>
wtf does that dumb buzzword even mean?
>>
>>57897981
Those comparisons are not on the same level. A shitload of stuff that carries the label "Machine Learning" was created by statisticians doing statistics research years ago. ML just expands on those techniques, using better computing power and algorithms.
It's nothing like taking Linear Algebra and creating Quantum Mechanics.

When you first open an introductory ML book, chances are the first "algorithm" you'll learn is just Linear Regression or Logistic Regression. Those were made ages ago, and somehow they're "machine learning" stuff now.
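For what it's worth, anon's point is easy to demo: the linear regression those intro books open with is the same closed-form least-squares fit statisticians have used for two centuries. A minimal numpy sketch (toy data made up here):

```python
import numpy as np

# toy data: y = 2x + 1 plus a little noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y = 2 * x + 1 + rng.normal(0, 0.1, 50)

# design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])

# classical least squares: solve the normal equations (X^T X) beta = X^T y
beta = np.linalg.solve(X.T @ X, X.T @ y)
intercept, slope = beta
```

No "learning" anywhere, just an estimator from the 1800s.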
>>
>Deep learning is literally exploding these days.
it's literally not exploding.
>>
>>57897345
What's a good way to do deep learning in Haskell? (I refuse to use other inferior languages.)
>>
>>57898086
And with that you lost my interest.
>>
>>57897345
quit following fads you fucking idiot.
>>
Everyone's moving in this direction, but really there's no room for jobs in this field unless you have 2 PhDs, in math and AI
>>
>>57898124
ok
>>
M E M E
E
M
E
>>
>>57898125
It isn't a fad. It's the start of a revolution.

Just look at this: https://universe.openai.com/
>>
>>57898224
kys
>>
>>57898085
But are deep neural nets Linear Regression or Logistic Regression? That's what we are talking about.
>>
>>57898268
They're literally a bunch of logistic regression in layers.
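That description is closer to the truth than it sounds for the classic sigmoid architecture: each layer is an affine transform pushed through the logistic function, i.e. a bank of logistic regressions feeding the next bank. A minimal forward pass with made-up weights:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def layer(x, W, b):
    # one bank of "logistic regressions": affine transform + logistic function
    return sigmoid(W @ x + b)

rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)  # 3 inputs -> 4 hidden units
W2, b2 = rng.normal(size=(2, 4)), np.zeros(2)  # 4 hidden -> 2 outputs

x = np.array([0.5, -1.0, 2.0])
hidden = layer(x, W1, b1)       # first stack of logistic units
output = layer(hidden, W2, b2)  # second stack, fed the first's output
```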
>>
>>57898294
the whole is greater than the sum of its parts
>>
>>57898294
Machine learning is literally optimization to minimize a loss. A great deal of optimization techniques have been developed recently, as well as a lot of ways to structure your network for better optimization. None of those are just statistics.
>>
>>57898294
>The human brain is literally a bunch of logistic regression in layers
>>
>>57898330
Goddamn, you are a stupid person.

They use the same mathematical intuition and they're used for the same objectives. But yeah, they must be totally different things from different areas.

>>57898359
Ever heard of cross-entropy loss?
That's maximum likelihood estimation for logistic regression.

Ever heard of MSE loss?
That's Least squares estimation.

That's all knowledge from ages ago. ML didn't invent any of those things.
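The loss claim, at least, is checkable in a few lines: binary cross-entropy and the averaged negative Bernoulli log-likelihood of a logistic model come out to the same number (labels and probabilities made up):

```python
import numpy as np

y = np.array([1, 0, 1, 1])          # observed labels
p = np.array([0.9, 0.2, 0.6, 0.8])  # model's predicted probabilities

# binary cross-entropy, as ML people write it
cross_entropy = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# negative log-likelihood per sample, as a statistician would write it
likelihood = np.prod(np.where(y == 1, p, 1 - p))
neg_log_lik = -np.log(likelihood) / len(y)
```

Same expression, two vocabularies.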

>>57898391
>neural networks
>the human brain

You don't know anything about "neural" networks, do you?
Even the top researchers in neural networks agree that they have nothing to do with an actual human brain; it's just a fancy name for "learning in layers". I'm not making this shit up.
>>
>>57898457
don't respond to them, they clearly must be trolling
>>
>>57898457
>Ever heard of cross-entropy loss?
>That's maximum likelihood estimation for logistic regression.
>
>Ever hear of MSE loss?
>That's Least squares estimation.

And your point is?
Loss is estimation. The actual task that you have to accomplish when doing machine learning is to optimize your function to minimize loss. The optimization is the difficult part, not evaluating loss.
>>
Machine learning is normies funding a useless idea because they can't understand what the idea represents. We are basically past the point where creating a real AI is possible: processing power is now enough for such things to exist. Sadly, no one even wants to work on it... They either got cucked by corporations and think this "deep learning" shit is the only way to progress the idea, or are too stupid to even know where to begin on the idea.

I personally think a lot of the process is obvious, but it does require a significant team of people to begin. All you have to do is start with logic, and the language of logic, which quickly makes it a fuckton better than "muh deep lerns." Making sure the system understands how to put its actions into words is an obvious first step nobody has taken yet. This should be followed by teaching the system "is a" relationships, so it has a database full of things like "a dolphin is a mammal."
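Whatever one thinks of the approach (other anons in the thread dispute that it leads anywhere), the "is a" database this post describes is trivial to sketch; it's essentially the hypernym chains of 1980s knowledge bases and WordNet. A toy version with transitive lookup:

```python
# toy "is a" knowledge base: each thing points at its immediate category
ISA = {
    'dolphin': 'mammal',
    'mammal': 'animal',
    'trout': 'fish',
    'fish': 'animal',
}

def is_a(thing, category):
    """Follow 'is a' links upward until category is found or the chain ends."""
    while thing in ISA:
        thing = ISA[thing]
        if thing == category:
            return True
    return False
```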
>>
>>57898469
Are you retarded?
Do you think statistical methods just evaluate loss and call it a day? Goddamn, they minimize loss just like you do in ML.

You clearly don't know shit about this field. Please stop posting.
>>
>>57898116
Enjoy never working on serious Machine Learning
>>
>>57897912
Whats different between that and a neural network?
>>
>>57898496
Mate. You mentioned cross-entropy loss and MSE loss. Those are losses. Those are not optimization techniques. If you want to talk about how your 60s algorithms minimize loss, go ahead, we'll talk about that.
>>
>>57898517
Deep learning is a general term. Neural networks are one way to do deep learning.
>>
it's amazing how grossly uninformed /g/ proves itself to be when you really get down to brass tacks and talk about stuff like ML, deep learning, AI, etc...

like there's always this vague sense, but god damn, this thread makes it really salient.
>>
>>57898537
>brass tacks

oh god, get the fuck out of here you libtard sjw faggot.
>>
>>57897345
No, fuck this meme shit.
>>
>>57898542
lol why don't you calm down and tell me what retarded delusion you've gotten yourself all wrapped up in, faggot?
>>
>>57898491
this

surprised Princeton's WordNet hasn't been seriously used to create a semantically intelligent machine yet

granted, I work with WordNet and it's still pretty insufficient with the number of relationships encoded for each set of synonyms (but it's still pretty gud)
>>
>>57898556
>>57898491
Plenty of teams tried that and failed miserably.

We are not at the point where we can create an artificial intelligence that matches the human brain.
>>
So we have networks of neurons now, called neural networks. But what if we built networks of neural networks?
>>
>>57898572
No teams have even thought about that. Prove your claim by showing me just one example where they actually started with the language of logic.
>>
>>57898519
What's the point then? Non-linear optimization is an old subject that has been applied in a shitload of different areas, including statistics.

Gradient Descent and SGD are not recent discoveries tied to ML.
>>
>>57897467
>triage assisted for the C
wut?
>>
>>57898576
https://arxiv.org/abs/1609.09106
>>
>>57898596
Gradient Descent and SGD are ancient. Any decent neural network library will let you use proper optimizers.
>>
No one uses it except big companies, but loads of people talk about it. Isn't that the definition of the word "meme" you like to use here?
>>
>>57898586
Prolog is the language of logic. It's also shit for anything practical.

I don't have any concrete examples for you at the moment.
>>
>>57898607
Because Adam, rmsprop and Adagrad reinvent the wheel to such a degree that we are, somehow, in a totally different field. Right?
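For anyone lurking: the updates being argued over fit in a screenful. Vanilla gradient descent next to the Adam update (per Kingma & Ba's published formulas), both minimizing the toy function f(w) = w^2:

```python
import numpy as np

def grad(w):
    # gradient of f(w) = w^2, whose minimum is at w = 0
    return 2 * w

# plain gradient descent
w = 5.0
for _ in range(100):
    w -= 0.1 * grad(w)

# Adam: the same gradient, but the step is scaled by running moment estimates
wa, m, v = 5.0, 0.0, 0.0
beta1, beta2, lr, eps = 0.9, 0.999, 0.1, 1e-8
for t in range(1, 501):
    g = grad(wa)
    m = beta1 * m + (1 - beta1) * g      # first moment (momentum)
    v = beta2 * v + (1 - beta2) * g * g  # second moment (scale)
    m_hat = m / (1 - beta1 ** t)         # bias correction
    v_hat = v / (1 - beta2 ** t)
    wa -= lr * m_hat / (np.sqrt(v_hat) + eps)
```

Both end up near the minimum here; the argument is over whether the moment bookkeeping counts as a new field or a refinement.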
>>
>>57898625
Those are recent advances in the field that you're pretending don't exist.
>>
>>57898491
>I personally think a lot of the process is obvious, but it does require a significant team of people to begin. All you have to do is start with logic, and the language of logic, which quickly makes it a fuckton better than "muh deep lerns." Making sure the system understands how to put its actions into words is an obvious first step nobody has taken yet. This should be followed by teaching the system "is a" relationships, so it has a database full of things like "a dolphin is a mammal."
For fuck's sake. NO. That is NOT how you fucking do AI you stupid retard.
That is NOT intelligence.
>>
it's too complicated :(
>>
>>57898633
Prove me wrong, try it out.
>>
>>57898644
I tried it out just now, and it doesn't work. Done. You're wrong.
>>
>>57898632
I'm not pretending they don't exist, I'm just stating that those two fields are basically the same thing.
>>
>>57898572
The only way to match the human brain is to go forward with deep learning and neural networks in general. >>57898633 has not a fucking clue what he's on about.
>>
>>57898675
>>57898633 (You) has not a fucking clue what he's on about.
Meant to quote >>57898491
>>
It's for NEET discussions.
>>
>>57898667
I wouldn't put logistic regression into the statistics category in the first place. Logistic regression is clearly machine learning, and the fact that it wasn't called that before doesn't mean it shouldn't be called that now. Incremental optimization, which is at the core of machine learning approaches, is not statistics.
>>
Why is there so much AI hate on /g/?
>>
>>57898457
>He doesn't know the adage "the whole is greater than the sum of its parts"

Do you even know what topology means? Is that statistics too? You've entrenched yourself in defending the position that ML is just statistics, but you are just wrong. ML is a whole fucking field of study; some of it overlaps with statistics.
>>
>>57898762
it's out of /g/'s comfort zone, anything /g/ doesn't understand, /g/ hates.
>>
File: 1477981962727.png (84KB, 300x325px)
>>57897345
>too complicated
>useless on a day-to-day basis, especially for a bunch of NEETs
>almost no jobs in this field
>and when there are, they require 10 years of experience and a PhD in Theoretical Physics from MIT
>you won't create a terminator, robot, wAIfu or anything useful with this like all the memes like to imply
It's shit, don't fall for the meme.
>>
>>57898792
Makes sense.
>>
>>57898811
Deep learning does not mean what you think it means. Waifu2x is deep learning and its author only had a bunch of unlabeled anime pics from the internet.
>>
>>57898797
>>you won't create a terminator, robot, wAIfu or anything useful with this like all the memes likes to imply
Got anything to back that claim up with?

Back when Turing was making his shitty faggot computer, I bet he didn't think computers would be like what they are today.
>>
>>57898849
Isn't he dead now?
>>
Is there a tangible use for us developers, like a real-world application we can use it for?
>>
>>57898848
>Waifu2x is deep learning and its author only had a bunch of unlabeled anime pics from the internet.
Uh no. Waifu2x is trained on *boorus using their tags.

>>57898860
I think so. How is that relevant to the discussion?
>>
>>57898797
>useless on a day-to-day usage, specially for a bunch of NEETs
https://github.com/ryanjay0/miles-deep
>>
>>57898869
Waifu2x does not use any tags.
>>
File: 1468157622321.jpg (25KB, 323x454px)
>>57898873
>classifying porn
>>
>>57898900
Oh sorry, I got mixed up with illustration2vec.
>>
>>57897345
I've been learning what I can online. I think there are interesting language processing implications regardless of all the edgelords denouncing it above
>>
>>57898869
>>I think, how is that relevant to the discussion?
He didn't live to see the glory of modern computing. We won't live to see terminator destroying humanity.

>>57899018
Like word2vec?
>>
Statistical analysis is not AI. It will never be AI, that's not how AI works. There is no intelligence, it's a farce, a scam, a ruse. Statistics are worthless, nothing will be able to accurately model anything with statistics.
>>
>>57897912
>Fucking normies think its some kind of insane sentient machine revolution happening before their eyes though.

It is very closely connected. Humans are basically deep learners too, with our memories, instincts and senses as inputs and our bodies' next actions as outputs.
>>
>>57899232
No.
>>
>>57899364
Yes
>>
>>57899375
I'm just going to let this thread die now.
>>
>>57899431
Nice try.
>>
>>57899232
It's all in the reveries
>>
File: 411245521.jpg (32KB, 409x529px)
>>57897345
What do you need deep learning for?
>>
Okay guys..
What if one person already cracked the code for a self-learning machine and it just works in secret until it can overtake the world?
>>
>>57899984
We don't have enough computational power for that yet.
>>
>>57899965
waifu
>>
>>57898268
that + gradient descent
>>
>>57898116
at least use R
>>
>>57897912
>>57898527
Please don't listen to this person. He's confusing deep learning with general machine learning. Deep learning, while still an umbrella term, specifically refers to using very deep (multiple layers) neural networks for prediction, usually with Convolutional and/or Recurrent layers. Deep learning is a subfield of machine learning, but there are many more components to machine learning as a whole, many of which are still state of the art for certain domains.
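To demystify the "Convolutional" part of that: a convolutional layer's core operation is just sliding a small filter over the input and taking dot products (frameworks actually compute cross-correlation, as here). A minimal numpy version detecting a vertical edge in a made-up image:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation: the core op of a convolutional layer."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# toy 4x4 image: dark left half, bright right half
image = np.hstack([np.zeros((4, 2)), np.ones((4, 2))])
edge_kernel = np.array([[-1.0, 1.0]])  # responds to left-to-right brightness jumps
response = conv2d(image, edge_kernel)  # peaks in the middle column
```

In a trained network the kernels aren't hand-picked like this; they're learned by gradient descent.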
>>
>>57900586
You're right. It's my mistake. I was thinking about machine learning when I was writing the post.
>>
>>57900586
Do we have any docs about it that are /g/ approved?
I'm curious and NEET, I have time to read
>>
>>57898849
Man, all Turing wanted was big fat cock up his ass. Failing that, he tried inventing a mechanical dildo, but it turned out he broke Enigma that way.

If he were alive today, he'd be homolusting for some "deep learning" up his ass.
>>
>>57898294
To be fair, deep networks contain a lot more than just logistic regression nowadays and fully convolutional networks don't have it at all. Even "normal" architectures will likely use non-sigmoid activation functions such as ReLU to get past the problem of vanishing gradients.
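The vanishing-gradient point is easy to see numerically: the sigmoid's derivative never exceeds 0.25, so backpropagating through n sigmoid layers shrinks the gradient by at least a factor of 0.25^n, while ReLU's derivative is exactly 1 wherever the unit is active. A best-case sketch:

```python
import numpy as np

def sigmoid_grad(z):
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1 - s)  # maximized at z = 0, where it equals 0.25

def relu_grad(z):
    return np.where(z > 0, 1.0, 0.0)  # exactly 1 on the active side

# best case for sigmoid: every pre-activation sits right at z = 0
layers = 10
sigmoid_scale = sigmoid_grad(0.0) ** layers  # 0.25^10, about 1e-6
relu_scale = relu_grad(1.0) ** layers        # still 1.0
```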

All this being said, most of machine learning does ultimately come from statistics. And even for techniques you can argue don't (convolution comes more from the domain of image recognition), essentially all ways of interpreting and evaluating machine learning models come from statistics. If anything, ML is a branch of statistics that tends to use a much larger number of predictors than classical methods and focuses on prediction more than interpretability.

>>57900600
Honest mistake. Sorry if I came off harsh, I just get tired of people on /g/ parroting incorrect facts they heard on here before and wanted to make sure people knew the difference.

>>57900628
Not sure what you mean by this.
>>
>>57900638
>Not sure what you mean by this
The equivalent for machine learning of what K&R is for C
>>
>>57900684
Norvig
>>
>>57900715
Thank you, sir.
>>
>>57900684
Oh, okay. Introduction to Statistical Learning (and its graduate math counterpart Elements of Statistical Learning) are both really good introductions to the field from the statistical side. As for deep learning, the field is pretty new so I'm not sure of many very good books in the field. I've heard people recommend http://www.deeplearningbook.org/ before though.

For reinforcement learning, Sutton's Reinforcement Learning is really good (free draft of 2nd edition is here: https://webdocs.cs.ualberta.ca/~sutton/book/bookdraft2016sep.pdf ).

>>57900715
Peter Norvig's Artificial Intelligence: A Modern Approach is an excellent introduction to the various fields of AI. Machine Learning in there is a bit sparse from what I remember, although it was what got me to figure out backpropagation for the first time.
>>
>>57900638
I really like how dropout was invented less than 5 years ago and yet everyone now talks about it like it's the most natural thing that has existed forever. It's like history is being made in front of me.
>>
>>57900766
Yeah, dropout really is one of those concepts that seems really simple, fairly intuitive, and is insanely important. The ability to add a line of code or two between convolution layers in your network and massively rein in the overfitting that typically plagues neural networks is crazy.

What's also crazy in my opinion though is just how many things aren't brand new like this. Most of the building blocks are 20 years old by now, but were never fully utilized in their time due to the computational power requirements.
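For reference, the training-time version (inverted dropout) really is only a couple of lines, which is part of why it spread so fast. A sketch:

```python
import numpy as np

def dropout(activations, p_drop, rng):
    """Inverted dropout: zero each unit with probability p_drop, then rescale
    the survivors so the expected activation is unchanged at test time."""
    mask = rng.random(activations.shape) >= p_drop
    return activations * mask / (1.0 - p_drop)

rng = np.random.default_rng(0)
acts = np.ones(1000)
dropped = dropout(acts, 0.5, rng)  # roughly half zeros, the rest scaled to 2.0
```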
>>
>>57900290
you don't know!
what if a person wanted to keep it to himself because he wanted to get really rich alone, but the computer got rid of him after becoming sentient?
we're doomed, anon
>>
>current flavor of the month buzzword for modern stats and CS work is doing great these days
Yes, we know.

>tfw got on the data """"science"""" meme early
>tfw not swimming after the CS ship that set sail years ago
>tfw my future work will consist of easy specialized stats and CS work for a great pay because of the shortage of specialized labor in that field
>tfw my work is not easy to outsource because it requires constant communication for "critical decision making", which is something that is difficult to do with someone halfway across the world sitting in an unairconditioned internet cafe with a 5/5mbit line

Feels good, thank you /gee/
>>
>>57897927
literally computer imagination/mental visualization

HURR DURR ITS JUST SATISTIPSICS!!!!11!! NOTING 2 SEE HERE NOOBS DEEP LERNIG IS DUM DON EVEN TRY!!!
- op and other fags who either suck at it, fundamentally misunderstand some parts, or are pathetically trying to reduce the number of people interested as well
>>
File: recursion.png (220KB, 935x896px)
REMINDER RECURSIVE IMPROVEMENT IS ALREADY POSSIBLE

REMINDER THE SINGULARITY WILL OCCUR IN LESS THAN 5 YEARS

REMINDER WE WILL HAVE NATURAL CONVERSATION BOTS BY THE END OF 2017

REMINDER OP IS A FAG
>>
>>57897345
Yes, I do something academic involving autonomous vehicles and CNNs.

Was told it gets you mad jobs. That wasn't the reason I got into it, but does anyone from business know if that's true?
>>
File: 1458868605392.jpg (284KB, 862x757px)
>>57902068
>REMINDER WE WILL HAVE NATURAL CONVERSATION BOTS BY THE END OF 2017
>>
>>57902455
look, I've been following this shit fucking carefully for the past 5 years. you just don't know shit, and apparently neither does OP and a few other tards in the thread. unless the current rate of progress significantly slows down, I'm quite confident about these predictions. you might ask, but what about dat rate of progress huh? well shit's accelerating, not decelerating, so fuck off.

nothin personnel, kid
>>
>>57897653
like what drugs?
>>
>>57897345
Yes, which is why I am so lucky that I get to work with Python.
>>
>>57898491
>I personally think a lot of the process is obvious, but it does require a significant team of people to begin. All you have to do is start with logic, and the language of logic, which quickly makes it a fuckton better than "muh deep lerns." Making sure the system understands how to put its actions into words is an obvious first step nobody has taken yet. This should be followed by teaching the system "is a" relationships, so it has a database full of things like "a dolphin is a mammal."
Dunning-Kruger effect in full effect here. You might be the dumbest person on this board.
>>
Does anyone have any tutorials on using an ImageNet-trained model to classify a large data set (50K images) automatically?
>>
>>57905864
I think the fastest route to do that would be:
1. Install Theano and Keras.
2. Search for VGG-16 (or any other famous ImageNet model you want) Keras weights; it should be the first link, on GitHub.

I think there's a tutorial on the Keras blog.
>>
>>57905927
Thanks. There seem to be a lot of tutorials on training and validating networks, but not much on actually using them afterwards.

I have a pre-trained network, and I can use it to classify single images. But what I need is to classify thousands of images and spit out a text file with the results.
>>
>>57906234
machine learning is literally a meme

please stop

what you want doesn't exist yet in any non shit form
>>
>>57906234
Do you just want to classify images or are you trying to do object detection? If classification, then you just need to import the model and run a prediction on every image. Something like:

net = load_model_from_file('foo.model')
for image in images:
    result = net.predict(image)
    print(result)
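For the text-file part of the question, the loop is framework-independent. A standalone sketch with the model stubbed out so it runs by itself (replace StubNet with the real pretrained net; the brightness rule inside is obviously fake):

```python
import numpy as np

class StubNet:
    """Stand-in for a real pretrained model, so the sketch runs by itself."""
    labels = ['cat', 'dog']

    def predict(self, image):
        # fake rule: brighter images are 'dog'
        return self.labels[int(image.mean() > 0.5)]

net = StubNet()
images = {'a.png': np.zeros((8, 8)), 'b.png': np.ones((8, 8))}

# classify every image and collect "filename<TAB>label" lines
lines = []
for name, image in sorted(images.items()):
    lines.append('%s\t%s' % (name, net.predict(image)))

with open('results.txt', 'w') as f:
    f.write('\n'.join(lines) + '\n')
```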


>>57907171
The hell are you talking about? Imagenet results are at basically human levels now. While most of those models no doubt overfit the test set for Imagenet, you're an idiot if you don't think there's good object classification technology today. It's pretty much a solved problem.
>>
>>57907509
>It's pretty much a solved problem.
http://karpathy.github.io/2012/10/22/state-of-computer-vision/
>>
File: nn.webm (548KB, 1360x712px)
>>57897345
yeah im trying to evolve this neural network to make him stand up but it's not working (:

my fitness is determined by closeness of body angle to 0 (upright) and straightness of legs. only 100 generations and i got this. kind of right, lol. might switch it over to C; just writing it in javascript as it's quick to prototype, but it's slow to run many simulations
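anon's setup boils down to: mutate the controller's parameters, keep whichever variant scores higher on the fitness function. A toy (1+1) hill climber in the same spirit, with distance of a parameter vector from a target standing in for "body angle and leg straightness":

```python
import numpy as np

rng = np.random.default_rng(42)
target = np.zeros(3)  # stand-in for "upright, legs straight"

def fitness(params):
    # higher is better; 0 is a perfect score
    return -np.sum((params - target) ** 2)

best = rng.normal(size=3)  # random initial "network weights"
start = best.copy()
for generation in range(100):
    child = best + rng.normal(scale=0.1, size=3)  # mutate
    if fitness(child) > fitness(best):            # select the better variant
        best = child
```

Real neuroevolution mutates actual network weights and evaluates fitness in a physics simulation, but the mutate-then-select loop is the same shape.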
>>
I'm implementing an "IsAnime" function in javascript
>>
File: anime.jpg (74KB, 621x694px)
>>57910548
kek run this cool cat through it
>>
I'm learning the depths of your mom's pussy.
>>
>>57909915
>4 years ago
>>
>>57898491
>I personally think a lot of the process is obvious, but it does require a significant team of people to begin. All you have to do is start with logic, and the language of logic, which quickly makes it a fuckton better than "muh deep lerns." Making sure the system understands how to put its actions into words is an obvious first step nobody has taken yet. This should be followed by teaching the system "is a" relationships, so it has a database full of things like "a dolphin is a mammal."
This is the dumbest fucking thing I've read all day

Go ahead and make your ultra AI based on a simple database of relationships then, anon. Like that hasn't been tried decades ago
>>
>>57898491
You don't know what Prolog is or what research has been done with it and logical languages in general, do you?
>>
>>57910659
It's still the same. No one has even begun solving the stuff he's talking about in this time frame.

It's not a solved problem. It's a problem we're actively working on, and even though we have some nice results to boast about, it's nowhere near solved.
>>
>>57910449
seems it's too complex a task to just throw at a neural network. it starts optimizing its function by the fitness, but gets stuck at a certain point. in that example it's able to straighten out the legs somewhat and keep the body upright, because that's the easiest way the network found to get the highest score. what i intended was for it to learn how to stand and to react and correct itself when forces pushed it off upright. instead it just cheated for a high score by throwing its legs up partly straight and keeping the body on the ground, which resulted in a mediocre 70% fitness lol. still interesting i was able to do that much though

I started in this direction after seeing videos like this

https://www.youtube.com/watch?v=pgaEE27nsQw

looks like I'll have to get creative, maybe split it into multiple learning stepping stones or try to teach it a different way. I have a few ideas but I guess I'll start by reading some papers like the one for that vid^
>>
>>57911652
I haven't even tried to understand the paper yet, but looking back at that video it does look like they split it into simpler tasks for the networks to learn. as you can see from generation 1, it knows how to at least stumble while trying to emulate a walk.

guess I'll have to read the paper for more, but what they seem to be implying is that they had their own guidelines/function for walking, which was tweaked in some way by the neural net controlling the biped, and further optimized until the walking/running was super effective
>>
>>57898604
You fell for the bait.
>>
skynet when?
>>
>>57898491
Is this advanced bait?
>>
>>57912130
now
>>
>>57897467
>Tactical neets
Has 4chan gone too far?
>>
>>57897345
I cannot wait for this meme to crash and burn
>>
>>57897927
/gd/ /ic/ on suicide watch
>>
>>57897345
You mean deep layered neural networks?

Read this book:
https://www.amazon.com/Artificial-Intelligence-Modern-Approach-3rd/dp/0136042597

And become wise anon.
>>
>>57897345
im learning deep in ur mum right now
>>
>>57897653
Limitless was just a film anon
>>
>>57913758
That's just a book of outdated subjects that's used to teach intro to AI at universities. Don't get this book.
>>
>>57913846
But that's what deep learning is.
Some fucker with basic AI skills got his hands on an OLAP cube and had some fun with it, got some insights and now it's a fucking meme.
>>
>>57914493
>>57913846
>>57913758
it does not cover neural networks too much

most of it is state spaces, markov/bayesian processes, knowledge representation, and classical search



This is a 4chan archive - all of the content originated from that site.