Media and digital literacies in Canadian teacher educators’ open educational practices: A post-intentional phenomenology
Black Box - glossary item
When considering technologies and the artifacts created using electronic technologies, there are elements that are hidden and unseen. They occur as if there is a 'black box' taking input from one side of the box, completing some hidden operation on that input, and creating some form of output. These 'black box' processes, organizations, affordances, and experiences determine how human-technology interactions occur or are sustained. Kallinikos (2002) explored the issues relating to the "internal constitution of particular technologies" and the hidden premises that "govern the human-technology interaction" (p. 290). Kallinikos (2002) argued that "once constructed, such forms matter and they matter a lot. The closure or openness of artifacts and the premises by which they admit human participation are heavily contingent on the way they are internally organized as systems" (p. 290).
Reference: Kallinikos, J. (2002). Reopening the black box of technology artifacts and human agency. International Conference on Information Systems, Barcelona, Spain. http://eprints.lse.ac.uk/190/1/Kallinikos-ICIS.pdf
Dimension 4.3: Criticality in datafication
Data literacies and datafication are not new to education. Historically, data collection and analysis at the school or system level focused on attendance and behavioural data (Selwyn et al., 2023). Teachers focus on the collection and curation of information and data in efforts to learn more about their students in order to plan their teaching and target student learning needs. Data collection at the educator level is in part a way to build relationships, by getting to know and understand details about students’ lives and interests, but also a way to examine, analyze, visualize, and profile student progress and skill development (Selwyn et al., 2023). In considering the definition of algorithms as systematic, step-by-step methods of solving a certain kind of problem or of representing a procedure (Danesi, 2002, p. 12), teachers and teacher educators tacitly understand how algorithms apply to teaching practices. For example, consider how teachers deconstruct learning into component parts to ensure a logical scaffold from one skill to the next, particularly for complex learning tasks. Data literacies in the hands of individual educators, such as the participants in this research, focus on datafication in service of the needs of students, as exemplified by CS’s question “who does it serve?”
Current education systems at the organizational level are now driven by the perceived need to collect ‘big data’, yet algorithmic logic and opaque, discriminatory practices can result from this datafication (Stewart & Lyons, 2021). Systems that turn socio-cultural learning into “quantifiable, extractable data” through the “proliferation of platform logics” (Nichols et al., 2021, p. 346) potentially turn teaching and learning activities into opportunities for surveillance and data extraction (Pangrazio & Selwyn, 2019; Stornaiuolo, 2019). A critical literacy for educators is found within this need to turn toward the “particular technical and economic substrates at work in digital platforms” (Nichols et al., 2021, p. 347). Thus, critical examination of datafication should be considered in light of the push to "centralize user-data, erode expectations for privacy, and expand state and corporate mechanisms for raced and classed surveillance” (Nichols et al., 2021, p. 346). This critical perspective is evident in the participants’ lived experiences: for example, in UF’s connection to data footprints and who owns the information students generate, in SH’s comments on the risks and rewards of using educational technology systems, and in ER’s lived experience of questioning how Indigenous knowledge that belongs to the community can be shared within anti-capitalistic notions of knowledge acquisition.
Criticality in datafication is suggested by Selwyn et al. (2023) as a starting point for teacher inquiry and professional judgement. This is echoed in the current rush for educators, and thus TEds, to put data literacies at the forefront of their teaching practice (Raffeghelli, 2022; Raffeghelli & Stewart, 2020; Stewart & Lyons, 2021). Participants share feelings of responsibility when designing learning activities that encourage students to use, or expose them to, predatory media production tools. Awareness of issues relating to privacy and data extraction is evident in the interview responses, focusing more on the technical considerations “relevant to read, manage, process and visualize data, and interact with algorithms” (Raffeghelli, 2022, p. 82). There is some suggestion that participants are moving toward proactive practices with data literacies that include students in conversations about their data usage (Atena et al., 2020; Raffeghelli, 2022), since there is a need for “criticality of pedagogical praxis in the face of automation and AI in teaching” (Gallagher et al., 2021, p. 427). A shift in the students’ role from passive participant in the hidden curriculum surrounding data extraction (Selwyn et al., 2023) is evident in LV’s question of who owns student data, and in CS’s questions about TCs taking pictures in a classroom with a cell phone and knowing where those images may be stored.
Criticality in MDL within an OEPr includes pushing toward opening datafication mechanisms to model transparency, particularly when LMS or research publications are located within black-boxed or paywalled locations. Participants mention this initiative within their OEPr as a way to present learning to audiences beyond the confines of their course. Their lived experiences reflect additional challenges, particularly with data management, when course work is shared within open and public knowledge spaces such as the OER Commons or Wikipedia as a way to mobilize learning for the greater good of world knowledge. Participants report a process of continual negotiation and critical intentionality while considering their own and their students’ safety, security, privacy, and permissions (SSPP).