RobinHood
Well-known member
Next week Google is releasing Gemini 1.5 Pro PAYG, which can take very large inputs - or what they call 'large context understanding'.
Here's an example where they feed it the three.js example code. That code, combined into a single text file, comes to just under 1M tokens (about $7 of input).
You can then ask it all sorts of questions about the codebase. I think this could be an invaluable tool to both site admins and devs.
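For anyone wanting to try this, a rough sketch of the workflow might look like the following. The file-gathering part is plain Python; the model call uses the `google-generativeai` package, and the API key, model name, and example question are placeholders, not tested values:

```python
# Sketch: concatenate a codebase into one big prompt for a long-context model.
from pathlib import Path

def gather_code(root, exts=(".js", ".html")):
    """Concatenate matching files under root into one tagged string."""
    parts = []
    for path in sorted(Path(root).rglob("*")):
        if path.is_file() and path.suffix in exts:
            parts.append(f"--- {path} ---\n{path.read_text(errors='ignore')}")
    return "\n\n".join(parts)

# Hypothetical usage against Gemini 1.5 Pro (requires an API key and quota):
# import google.generativeai as genai
# genai.configure(api_key="YOUR_KEY")
# model = genai.GenerativeModel("gemini-1.5-pro")
# corpus = gather_code("three.js/examples")
# reply = model.generate_content([corpus, "How do the orbit controls work?"])
# print(reply.text)
```

Whether this fits under the 1M-token limit obviously depends on the size of the codebase, so you may need to filter by directory or file type.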
Maintaining good quality docs is incredibly time consuming, but if the code base is well commented and the LLM has full access to it, a tool like this could be far more valuable for learning about the platform than searching forums and docs.
I'm sure there will be other ways to create large context tools, but experimenting with this if it's easy to deploy would certainly be interesting.
You could even try feeding it the transcripts from the building with XF2 series.