SAIL: Broadening Learning, ELAI, Safety, Regulation
Welcome to Sensemaking, AI, and Learning (SAIL), a regular look at how AI is impacting learning.
Our education system has a uni-dimensional focus: learning things. Of course, we say we care about developing the whole learner, but the metrics that matter (grades, transcripts), the ones that underpin the education system, are largely focused on teaching students things that have long been Google-able and are now increasingly doable by AI. Developments in AI matter in ways that call into question large parts of what happens in our universities. This is not a statement that people don't need to learn core concepts and skills. My point is that the fulcrum of learning has shifted. Knowing things will matter less and less as AI improves its capabilities. We'll need to start intentionally developing broader and broader attributes of learners: metacognition, wellness, affect, social engagement, etc. Education will continue to shift toward human skills and away from primary assessment of knowledge gains disconnected from skills, practice, and ways of being.
AI and Learning
We are four weeks away from our Empowering Learners for the Age of AI conference at ASU. It's shaping up to be a fantastic event - outstanding speakers and provocative panels.
Well, apparently we don't feel the same pressure from robots as we do from humans. "When humans and robots work redundantly on a task, this can lead to motivational losses for the human team partner and make effects such as social loafing more likely."
The intertwined history of AI and Education. "I argue that the fields of artificial intelligence (AI) and education have been deeply intertwined since the early days of AI."
Artificial intelligence in higher education: trick or treat? "as a leader at your institution or organization, how should you be thinking about the potential of AI as one of the items in your “bag of tricks” to achieve better institutional outcomes and efficiencies at your organization?"
AI and Humanity
Artificial General Intelligence is already here. Why the reluctance to acknowledge it?
AI has enormous environmental implications. "ChatGPT gulps up 500 milliliters of water (close to what’s in a 16-ounce water bottle) every time you ask it a series of between 5 to 50 prompts or questions."..."In July 2022, the month before OpenAI says it completed its training of GPT-4, Microsoft pumped in about 11.5 million gallons of water to its cluster of Iowa data centers, according to the West Des Moines Water Works. That amounted to about 6% of all the water used in the district, which also supplies drinking water to the city’s residents."
The future of AI is GOMA. Big tech dominates. I'm reminded of Fei-Fei Li's comment that all US universities combined don't have the compute power (and, I'll add, the vision and will) to create ChatGPT. We (higher education) are not in the arena. Universities need to start thinking as a collective to engage with this opportunity. Training PhDs to work in big tech is not really owning our (universities') future.
AI and Legal
It's been a busy week for AI regulation (or moves toward some level of regulation):
Biden signs an executive order, focused on safety and preserving Americans' privacy. It will have implications for how AI is developed and how data is used. Notably, there is a focus to "accelerate hiring" of AI talent as part of "a government-wide AI talent surge".
In the UK, the Bletchley Declaration was signed this week by 28 countries: "We resolve to work together in an inclusive manner to ensure human-centric, trustworthy and responsible AI that is safe, and supports the good of all through existing international fora and other relevant initiatives, to promote cooperation to address the broad range of risks posed by AI." Of course, each country's intelligence and military communities will be actively building AI behind the scenes - the stakes are too high to self-regulate. This is theatre.
Andrew Ng debunks fear of AI, saying it's an attempt by big tech, which is already well on its way, to prevent others from catching up (i.e. open source communities).