Submitted by srchvrs on Tue, 10/21/2014 - 15:30
Michael Jordan gave a very interesting interview on big data, the singularity, the Turing test, P=NP, and artificial intelligence. Yann LeCun's reaction: "Michael Jordan, like some of us, has strong opinions about certain things."
Below I summarize what seem to be the main points made by Michael Jordan (item 7 is my favorite):
1. We don't know how neurons learn.
2. Neurons in artificial neural networks do not mimic real neurons: "Anyone in electrical engineering would recognize those kinds of nonlinear systems. Calling that a neuron is clearly, at best, a shorthand. It's really a cartoon."
3. The number of hypotheses is enormous. Thus, the multiple-comparison/testing problem can be a real issue in large-scale data mining (see the simulation sketch after this list).
4. He predicts a "... big-data winter. After a bubble, when people invested and a lot of companies over-promised without providing serious analysis, it will bust."
5. Solving these problems will take decades, during which we will improve steadily; no single significant technological breakthrough has been made.
6. "Despite recent claims to the contrary, we are no further along with computer vision than we were with physics when Isaac Newton sat under his apple tree."
7. If he had an unrestricted $1 billion grant, he would work on natural language processing.
8. The singularity is not an academic discipline.
9. On the Turing test: "there will be a slow accumulation of capabilities, including in domains like speech and vision and natural language." Thus, the Turing test does not seem to be "... a very clear demarcation."
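Point 3 is easy to see in a quick simulation. Below is a minimal sketch (in Python with NumPy/SciPy; the number of tests, sample size, and threshold are arbitrary illustrative choices, not anything from the interview): when thousands of hypotheses are tested on pure noise, a naive 0.05 significance cutoff "discovers" hundreds of spurious effects, while a Bonferroni-style correction suppresses nearly all of them.

```python
# Minimal illustration of the multiple-comparison problem:
# run many tests on pure noise and count spurious "discoveries".
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
num_tests = 10_000   # number of hypotheses checked (illustrative)
sample_size = 50
alpha = 0.05

# Each "experiment" compares two samples drawn from the SAME distribution,
# so every null hypothesis is true and every rejection is a false positive.
p_values = np.array([
    stats.ttest_ind(rng.normal(size=sample_size),
                    rng.normal(size=sample_size)).pvalue
    for _ in range(num_tests)
])

print("false positives at alpha=0.05:", np.sum(p_values < alpha))          # roughly 500
print("after Bonferroni correction:  ", np.sum(p_values < alpha / num_tests))  # roughly 0
```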