
Content provided by [i3] Institutional Investment Podcast. All podcast content, including episodes, graphics and podcast descriptions, is uploaded and provided directly by [i3] Institutional Investment Podcast or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here: https://hi.player.fm/legal

82: ChatGPT, Machine Learning and Venture Capital

1:16:12
 

In this episode of the [i3] Podcast, we look at machine learning and artificial intelligence. Our guests are Kaggle founder Anthony Goldbloom and Stanford CS professor Chris Manning, who are both involved in venture capital firm AIX Ventures as investment directors. We speak about the state of play in machine learning and artificial intelligence, the most interesting applications, including ChatGPT, and opportunities for investors in this space, covering smart sensors, travel agents and personal assistants.

Overview of the podcast:

Anthony Goldbloom:
04:00 The idea for Kaggle came from a conference competition
06:00 Looking for the most accurate algorithms
07:30 Before Kaggle, every academic discipline had its own set of machine learning techniques
08:00 One technique won problem after problem
09:00 Rise of neural networks: 2012 is often called the annus mirabilis for machine learning
10:30 I'm mind-blown by what you can do with ChatGPT
11:30 Using summarisation through ChatGPT
12:30 We are in a world right now where the capabilities of these models run far ahead of the applications. People haven't really built companies around these models yet
14:50 The rise of chat-powered travel agents? Adding databases to ChatGPT
17:00 Why was Google interested in Kaggle?
19:00 Tweaking the value estimation algorithm for US real estate website Zillow
21:00 Surpassing physicians in diagnosing lung cancer
22:30 Two Sigma and Optiver also used Kaggle to solve problems
23:00 Hedge funds that crowdsourced investment problems
24:00 Being a one-person band is hard in investing: you need to not only find the alpha signal, but also implement the trade in a way that doesn't move the market
27:00 Does machine learning work in time series? Yes, but it requires more babysitting if your algorithm works in an adversarial setting
32:00 What I bring to AIX Ventures is an understanding of where the gaps are in the tools for machine learning
33:00 Examples of companies we invested in
35:00 Embedding ultra-light machine learning into appliances

Chris Manning:
37:24 I was interested in how people learn languages, while I was always playing around with computers. Then I became interested in Ross Quinlan's ID3 algorithm for natural language processing
40:00 I started to work with large digital language databases slightly before the world wide web really kicked off
43:00 The combination of neural networks and predictive text led to the revolutionary breakthroughs we see now with ChatGPT
45:30 You can use ChatGPT for text analysis, such as sentiment analysis or summarisation of specific information (a minimal code sketch follows this overview)
46:00 These models are just wonderful, but of course there are still problems. On occasion these models tend to hallucinate: they are just as confident producing made-up content. And at times they lack consistency in thinking; they will say things that contradict what they said previously
47:00 Human learning is still far more efficient at getting signal from data than machine learning
50:00 The majority of business is conducted through human language, whether it is sales or support. These models can help people work faster and better
52:00 The case of Google and zero-shot translation
57:00 Facebook experimented with two systems talking to each other, but found the systems would not stick to English and instead developed a more efficient symbol system
1:00:00 Interesting businesses we've invested in: weather prediction
1:02:00 A lot of computing that was previously done in the cloud is now done on the device, which is much quicker
1:04:00 Can NLP read the sentiment of a market by consuming just a lot of text? Well, a lot of mob mentality is expressed in language rather than numbers
1:07:00 The start of AIX Ventures and the two Australians
1:11:00 What might be the next big thing in NLP?
1:13:00 Future applications of language models might potentially look at video and personal assistants
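To make the summarisation and sentiment-analysis use cases mentioned at 11:30 and 45:30 concrete, here is a minimal sketch using the OpenAI Python client. The model name, prompt wording and the summarise_and_score helper are illustrative assumptions, not something taken from the episode.

# Minimal sketch: summarise a piece of financial text and label its sentiment
# with a chat model. Assumes the OpenAI Python client (pip install openai)
# and an OPENAI_API_KEY in the environment; model and prompt are assumptions.
from openai import OpenAI

client = OpenAI()

def summarise_and_score(text: str) -> str:
    """Return a two-sentence summary plus a sentiment label for the text."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any chat-capable model works here
        messages=[
            {"role": "system",
             "content": "You summarise financial text and label its sentiment."},
            {"role": "user",
             "content": ("Summarise the following text in two sentences, then "
                         "state its overall sentiment as positive, negative or "
                         "neutral:\n\n" + text)},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(summarise_and_score("Shares rallied after the fund reported "
                              "better-than-expected quarterly returns."))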

104 episodes




 
