Llama3 Stock Market Function Calling
Function Calling with Llama 3 and LangChain
The tech world is abuzz with the release of Meta's Llama 3, and Groq is excited to serve this powerful model at industry-leading speeds! Llama 3 excels at function calling, making it an ideal choice for function calling applications. This cookbook will guide you through using Llama 3 with Groq's LangChain integration and Yahoo Finance's yfinance API for real-time stock market analysis. We'll demonstrate how to write functions that call the yfinance API from a user prompt, enabling the LLM to provide relevant, real-time information on the stock market and answer a range of questions from users.
Setup
As mentioned in the introduction, we will be using Meta's Llama 3-70B model for function calling in this notebook. We are also using LangChain's ChatGroq class to define our LLM and integrate it with additional LangChain tooling. Note that you will need a Groq API key to proceed; you can create an account to generate one for free.
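The original setup cell isn't reproduced here, so the following is a minimal sketch of what it might look like. The `langchain_groq` package and the `llama3-70b-8192` model ID are assumptions based on Groq's public LangChain integration; adjust them to whatever your account exposes.

```python
import os

from langchain_groq import ChatGroq

# Assumes your Groq API key is exported as GROQ_API_KEY.
# "llama3-70b-8192" is Groq's model ID for Llama 3-70B at the time of writing.
llm = ChatGroq(
    groq_api_key=os.environ["GROQ_API_KEY"],
    model="llama3-70b-8192",
    temperature=0,  # deterministic outputs make tool calls more reliable
)
```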
Defining Tools
Now we will define two LangChain tools that leverage the yfinance API to answer user queries. Our goal is to enable the LLM to provide accurate and timely information on any stock, just like you'd get on Yahoo Finance. We'll focus on two types of information: current data, such as price, volume, and beta, and historical prices. To achieve this, we'll create two tools: get_stock_info for current information and get_historical_price for historical prices.
Each tool includes a detailed description that helps the LLM determine which tool to call and how to fill its parameters. In get_stock_info, we list all the keys available in data.info so that Llama 3 selects the correct key verbatim. In get_historical_price, we explicitly explain the purpose of start_date and end_date and provide guidance on how to fill them. For both tools, we've found that Llama 3 can identify the correct stock symbol from a company name without additional prompting.
Using our Tools
Now we will chain our tools together and bind them with our LLM so that they can be accessed:
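A sketch of that binding step, assuming `llm`, `get_stock_info`, and `get_historical_price` from the cells above:

```python
# Assumes `llm`, `get_stock_info`, and `get_historical_price` are defined
# in the cells above.
tools = [get_stock_info, get_historical_price]
llm_with_tools = llm.bind_tools(tools)
```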
Let's test our function calling with a few simple prompts:
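The test cells aren't reproduced above; they might look like the following, assuming `llm_with_tools` from the previous cell. Inspecting `.tool_calls` on the returned message shows which tools the model chose and with what arguments:

```python
# Assumes `llm_with_tools` from the previous cell.
response = llm_with_tools.invoke("What is the market cap of Meta?")
print(response.tool_calls)

response = llm_with_tools.invoke(
    "Compare the volume of Apple and Microsoft stock."
)
print(response.tool_calls)
```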
[{'name': 'get_stock_info', 'args': {'symbol': 'META', 'key': 'marketCap'}, 'id': 'call_3xm9'}]
[{'name': 'get_stock_info', 'args': {'symbol': 'AAPL', 'key': 'volume'}, 'id': 'call_2p2z'}, {'name': 'get_stock_info', 'args': {'symbol': 'MSFT', 'key': 'volume'}, 'id': 'call_hvp4'}]
As you can see, our first query successfully called get_stock_info with parameters META and marketCap, which are a valid stock symbol and key, respectively. In our second query, the LLM correctly called get_stock_info twice, once each for Apple and Microsoft.
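The historical-price prompts might look like this — note that we prepend today's date so the model can resolve relative ranges like "the last 3 years" (again assuming `llm_with_tools` from the cells above):

```python
from datetime import date

today = date.today().isoformat()

response = llm_with_tools.invoke(
    f"Today is {today}. How has the S&P 500 performed over the last 3 years?"
)
print(response.tool_calls)

response = llm_with_tools.invoke(
    "Compare Google's and Amazon's stock prices over the course of 2023."
)
print(response.tool_calls)
```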
[{'name': 'get_historical_price', 'args': {'symbol': '^GSPC', 'start_date': '2021-04-23', 'end_date': '2024-04-23'}, 'id': 'call_k06n'}]
[{'name': 'get_historical_price', 'args': {'symbol': 'GOOGL', 'start_date': '2023-01-01', 'end_date': '2023-12-31'}, 'id': 'call_ca9y'}, {'name': 'get_historical_price', 'args': {'symbol': 'AMZN', 'start_date': '2023-01-01', 'end_date': '2023-12-31'}, 'id': 'call_h6q6'}]
Our tool-calling LLM also correctly identified get_historical_price for historical price questions, and appropriately called it twice for the two-company comparison. Note that to perform any kind of lookback analysis, you'll need to provide the current date in the prompt, since the LLM has no inherent knowledge of today's date.
Putting it all together
This function, plot_price_over_time, is not called by the LLM; instead, our own code will invoke it to plot historical prices whenever get_historical_price has been called:
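The plotting helper isn't reproduced above; a minimal matplotlib sketch follows (the original notebook may use a different plotting library). It expects the list of DataFrames returned by get_historical_price calls, each with a Date column and one price column named after its symbol:

```python
import matplotlib.pyplot as plt
import pandas as pd


def plot_price_over_time(price_dfs: list) -> pd.DataFrame:
    """Merge per-symbol price DataFrames on Date and plot one line each.

    Returns the merged DataFrame so it can be inspected or reused.
    """
    merged = price_dfs[0]
    for df in price_dfs[1:]:
        merged = merged.merge(df, on="Date", how="outer")
    merged = merged.sort_values("Date")

    for column in merged.columns:
        if column != "Date":
            plt.plot(merged["Date"], merged[column], label=column)
    plt.xlabel("Date")
    plt.ylabel("Closing price (USD)")
    plt.title("Historical stock prices")
    plt.legend()
    plt.show()
    return merged
```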
Finally, we will use LangChain to tie everything together. Our system prompt will provide the current date for context, and our function will execute each tool call the model makes, send the output back to the LLM so that it can respond to the user prompt with relevant information, and plot historical prices if that's what was asked for:
'The beta for Meta stock is 1.184.'
'A historical stock price chart for GOOGL and AAPL and META has been generated.'
Conclusion
In this notebook, we've demonstrated how to harness the power of Groq API's function calling with Llama 3 and LangChain integration. Llama 3 is an impressive new model, and its capabilities are amplified when combined with Groq's exceptional LPU speed! To explore the interactive app that accompanies this notebook, please visit: https://llama3-function-calling.streamlit.app/