Interesting article. I’d go so far as to say that all biological entities are a machine of some type. A computational, entropy-avoidant machine. We just happen to be filled with fluids other than oil. We DO have some oils internally, too; we just call them lipids. Human, machine, potato, tomato.
Article on declining accuracy of AI this spring/summer: https://gizmodo.com/study-finds-chatgpt-capabilities-are-getting-worse-1850655728?utm_campaign=mb&utm_medium=newsletter&utm_source=morning_brew
Thanks for bringing this into the conversation. I will post the original article and a better response below.
The crux here is that LLMs change over time. Therefore, the same input words do not yield the same exact outputs over time. Competencies don't necessarily change, but the LLM requires different inputs to get the same results, and it is up to humans to figure that out.
The issue for schools is: what happens when you streamline a research protocol for students, and then the search process breaks midstream because the LLM no longer responds to what was previously an acceptable input?
Weird stuff, huh? This is the world we live in now.
For this reason, companies like OpenAI keep snapshots of their LLMs so that you can go back in time, so to speak, and run your prompts against a particular historical version of the model. But it costs money to maintain these snapshots, so we cannot depend on them existing for the long term.
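To make the snapshot idea concrete, here is a minimal sketch, assuming the OpenAI Python SDK and an API key in the environment; the dated model names and the prime-number prompt are illustrative only, patterned on the March/June 2023 GPT-4 comparison in the arXiv paper linked below. It sends one prompt to two pinned snapshots rather than the floating alias, so the behavior you tested is the behavior you keep getting, at least until the snapshot is retired.

```python
# Minimal sketch, assuming the OpenAI Python SDK (v1.x) and OPENAI_API_KEY
# set in the environment. The dated snapshot names are the March/June 2023
# GPT-4 versions compared in the arXiv paper linked below; OpenAI retires
# old snapshots, so substitute whatever dated models are currently offered.
from openai import OpenAI

client = OpenAI()

# A prime-number identification prompt in the style of the paper's tasks.
PROMPT = "Is 17077 a prime number? Think step by step, then answer yes or no."

# Pinned snapshots, not the floating "gpt-4" alias that silently changes.
for model in ["gpt-4-0314", "gpt-4-0613"]:
    reply = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": PROMPT}],
        temperature=0,  # reduces, but does not eliminate, run-to-run variation
    )
    print(f"{model}: {reply.choices[0].message.content[:200]}")
```

Pinning a dated snapshot is the "go back in time" move described above; relying on the floating alias is what breaks a classroom protocol midstream.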
Check out these links:
https://arxiv.org/pdf/2307.09009.pdf
https://open.substack.com/pub/aisnakeoil/p/is-gpt-4-getting-worse-over-time?utm_campaign=post&utm_medium=web