My Teaching Mission Statement

This post was developed as part of an engagement project for my Education Doctorate coursework. It explores how Christian values can inform ethical practices in computer science education, connecting ISTE Standard 4.7 with service learning, civic engagement, and professional responsibility.

I prioritize biblical principles, ethical frameworks, and practical teaching strategies in my teaching. Here, I consider how faith and pedagogy can come together to prepare students as both skilled technologists and ethical leaders.

Values and Ethical Issues 

As a Christian Computer Science educator, my values and ethical principles related to ISTE Standard 4.7 reflect the intersection of my faith and my vocation. My mission is to equip students to excel in technical knowledge and to approach their digital lives by modelling Christian values in their work and interactions.

I envision my students harnessing the power of computer science to promote the common good, driving positive societal change through collaborative service-learning experiences and the responsible exploration of cutting-edge technologies like AI. By embedding ethical principles and a heart for service into their digital lives, they will lead with integrity, champion justice, and steward technology as a force for human flourishing.

Civic Responsibility and Christian Responses

I encourage students to apply their technical knowledge for the betterment of society, serving the community and improving technological literacy while satisfying course outcomes. Integrating service learning into the curriculum gives students a unique opportunity to engage in meaningful, community-centred projects. Robledo Yamamoto et al. (2023) conducted a systematic review of service learning in computer and information science, identifying benefits, challenges, and best practices. Their review highlights how service learning strengthens students’ practical skills and community awareness, but it also emphasizes the need for greater benefits for all stakeholders, especially non-profit community partners. Such experiences encourage students to reflect on their values and consider the ethical implications of their work, ensuring they approach their careers with integrity and compassion.

Self-reflection is a cornerstone of effective collaboration, particularly in software engineering, where the success of a project often hinges on the team’s ability to communicate and adapt to client needs, as noted by Groeneveld et al. (2019). Drawing on these principles, I encourage students to develop socially responsible practices, recognizing the impact their work can have on their digital and physical communities.

This aligns with Matthew 22:37–39 in the English Standard Version (ESV) of the Bible: “And he said to him, ‘You shall love the Lord your God with all your heart and with all your soul and with all your mind. This is the great and first commandment. And a second is like it: You shall love your neighbor as yourself.’”

This principle also supports ISTE Standard 4.7b: “Inspire and encourage educators and students to use technology for civic engagement and to address challenges to improve their communities.” Students are guided to develop socially responsible practices, respecting their digital and physical communities.

Integrating Christian Values in Computer Science Education

Christian values can be a strong foundation in computer science education by fostering ethical decision-making among students. I emphasize discernment, integrity, and humility, modelling to students how to carefully consider the moral implications of their work to be skilled technologists and ethical leaders. I teach frameworks like the ACM Code of Ethics and the IEEE Global Initiative on Ethics of AI, which provide valuable guidance for navigating complex moral dilemmas (Anderson, 2018).

An important aspect of this instruction is teaching students methods and tools for discernment. One effective framework is deontological ethics, specifically divine command theory and natural law. I also guide students through scenario-based exercises in which they evaluate potential outcomes of their technical work against ethical principles. For example, analyzing law enforcement’s use of historical crime data, social media activity, and demographic information to forecast potential criminal activity helps students develop a critical, comprehensive approach to ethics that balances their technical knowledge with moral clarity.

I believe it is crucial to model ethical decision-making in my teaching and invite students to emulate this approach professionally. This aligns with the Apostle Paul’s message to the Corinthian church: “Be imitators of me, as I am of Christ” (1 Corinthians 11:1, ESV, 2001). Such practices encourage them to reflect on how their work impacts others and how their faith informs their responsibilities as computer scientists.

This principle supports ISTE Standard 4.7a and its focus on using technology to enhance communities. Students are encouraged to exhibit ethical behavior and lead with integrity, which prepares them to become ethical leaders in the tech industry.

AI / Machine Learning and Theological Perspectives

Advancements in artificial intelligence offer tremendous opportunities for innovation but also demand critical examination through the lens of Christian theology. Biblical teachings on human dignity, stewardship, and the risks of over-reliance on technology provide a valuable framework for assessing the societal implications of AI. In my teaching, I encourage students to reflect on the intersections between their work in AI and inclusive basic Christian values – as introduced by Horace Mann – emphasizing the ethical responsibility to steward technology wisely and compassionately. Discussions often center on the potential benefits and dangers of AI, particularly its impact on society and individual well-being, as well as the role of computer scientists as ethical leaders.

Oversby and Darr (2024) suggest that a materialistic worldview leads many AI researchers and enthusiasts to envision Artificial General Intelligence (AGI) with autonomous goals, potentially posing risks to humanity. This contrasts with the classical Christian worldview, which upholds the uniqueness of human intelligence as possessing a soul that is not reducible to algorithms. Schuurman (2015, p. 20) notes that regardless of AI advancements, human beings’ distinctive nature, made in God’s image, with free will and consciousness, should remain unquestioned.

This principle ties to ISTE Standard 4.7c, which encourages educators to critically examine online media sources and identify underlying assumptions. Students are invited to balance innovative problem-solving with discernment, aiming to act as responsible stewards of both technology and humanity.

Team Dynamics in Development Projects

Creating a healthy team environment in development projects requires intentional effort to foster collaboration, effective communication, and ethical leadership. Biblical principles such as servant leadership (Philippians 2:5-8), teamwork (Ecclesiastes 4:9-12), and conflict resolution (Matthew 18:15-17) offer guidelines for developing these qualities. 1 Corinthians 12 provides a powerful metaphor for the church as a body, emphasizing the value of each member’s unique contributions and the importance of working together harmoniously. Drawing from this, students are encouraged to lead with humility, collaborate with integrity, and approach conflicts as opportunities for growth and mutual understanding.

Diaz-Sprague and Sprague (2024) identify significant gaps in ethical training and teamwork skills across technology disciplines, particularly in computer science and engineering. They note the inconsistent application of key teamwork principles and suggest structured exercises focusing on communication and cooperation. These exercises, which have garnered positive feedback from students, highlight the importance of intentional training in these areas to prepare computer science students for real-world workplace challenges. Incorporating such activities into the curriculum allows students to practice these skills in a controlled setting, adopting a culture of respect, inclusion, and collaboration that translates into their professional environments.

In my software engineering courses, students engage in role-playing scenarios to address team conflicts, reflecting on how conflict resolution principles can transform challenges into opportunities for improving relationships and productivity. They also participate in team retrospectives, where they assess their group dynamics, communication, and decision-making processes, identifying areas for improvement. These practices align with the principles of servant leadership, encouraging students to prioritize the success and well-being of their team members while contributing their best efforts to shared goals.

This approach aligns with ISTE Standard 4.7b, which emphasizes fostering a culture of respectful interactions, particularly in online and digital collaborations. Grounding teamwork practices in biblical principles and integrating structured exercises that build essential skills allow students to learn to navigate the complexities of team dynamics with grace and professionalism.

References

Robledo Yamamoto, F., Barker, L., & Voida, A. (2023). CISing up service learning: A systematic review of service learning experiences in computer and information science. ACM Transactions on Computing Education. https://doi.org/10.1145/3610776

The Holy Bible ESV: English Standard Version. (2001). Crossway Bibles.

Oversby, K. N., & Darr, T. P. (2024). Large language models and worldview – An opportunity for Christian computer scientists. Christian Engineering Conference. https://digitalcommons.cedarville.edu/christian_engineering_conference/2024/proceedings/4

Schuurman, D. C. (2015). Shaping a digital world: Faith, culture and computer technology. InterVarsity Press.

Diaz-Sprague, R., & Sprague, A. P. (2024). Embedding Moral Reasoning and Teamwork Training in Computer Science and Electrical Engineering. The International Library of Ethics, Law and Technology, 67–77. https://doi.org/10.1007/978-3-031-51560-6_5

Anderson, R. E. (2018). ACM code of ethics and professional conduct. Communications of the ACM, 35(5), 94–99. https://doi.org/10.1145/129875.129885

Groeneveld, W., Vennekens, J., & Aerts, K. (2019). Software Engineering Education Beyond the Technical: A Systematic Literature Review. https://doi.org/10.48550/arxiv.1910.09865  

Should AI Be Entrusted with Christian Roles? Exploring the Case for and Against Christian Chatbots and Religious Robots

Artificial Intelligence (AI) has quickly transitioned from fiction to an integral part of modern life. Among its many applications, the idea of a Christian chatbot or religious robot has ignited significant debate. Can machines support spiritual journeys, aid evangelism, or even participate in church services? This post examines the arguments for and against these innovations and explores how these systems can minimize false statements to uphold their integrity and purpose. These reflections are based on a conversation I had with Jake Carlson, founder of The Apologist Project.

The Case for Christian Chatbots and Religious Robots

The primary argument for Christian chatbots lies in their potential to advance evangelism and make Christian teachings accessible. In our discussion, Jake emphasized their role in fulfilling the Great Commission by answering challenging theological questions with empathy and a foundation in Scripture. His chatbot, apologist.ai, serves two key audiences: nonbelievers seeking answers about Christianity and believers who need support in sharing their faith. Tools like this can become a bridge to deeper biblical engagement.

Religious robots, meanwhile, show promise in supporting religious practices, particularly where human ministers may be unavailable. Robots like BlessU-2, which delivers blessings, and SanTO, designed to aid in prayer and meditation, illustrate how technology can complement traditional ministry. These innovations also provide companionship and spiritual guidance to underserved groups, such as the elderly, fostering a sense of connection and comfort (Puzio, 2023).

AI also offers significant potential in theological education. Fine-tuning AI models on Christian texts and resources allows developers to create tools that help students and scholars explore complex biblical questions. Such systems enhance learning by offering immediate, detailed comparisons of theological perspectives while maintaining fidelity to core doctrines (Graves, 2023; Schuurman, 2019). As Jake explains, models can be tailored to represent specific denominational teachings and traditions, making them versatile tools for faith formation.

The Challenges and Concerns

Despite their potential, these technologies raise valid concerns. One significant theological issue is the risk of idolatry, where reliance on AI might inadvertently replace engagement with Scripture or human-led discipleship. Jake emphasizes that Christian chatbots must clearly position themselves as tools, not authorities, to avoid overstepping their intended role.

Another challenge lies in the inherent limitations of AI. Critics like Luke Plant and FaithGPT warn that chatbots can oversimplify complex theological issues, potentially leading to misunderstandings or shallow faith formation (VanderLeest & Schuurman, 2015). AI’s dependence on pre-trained models also introduces the risk of factual inaccuracies or biased interpretations, undermining credibility and trust. Because of this, they argue that pursuing Christian chatbots is irresponsible and violates the commandment against graven images.

Additionally, the question of whether robots can genuinely fulfill religious roles remains unresolved. Religious practices are inherently relational and experiential, requiring discernment, empathy, and spiritual depth—qualities AI cannot replicate. As Puzio (2023) notes, while robots like Mindar, a Buddhist priest robot, have conducted rituals, such actions lack the relational and spiritual connection that is central to many faith traditions.

Designing AI to Minimize Falsehoods

Given the theological and ethical stakes, developing Christian chatbots requires careful planning. Jake’s approach offers a valuable framework for minimizing errors while ensuring theological fidelity. Selecting an open-source AI model, for example, provides developers with greater control over the system’s foundational algorithms, reducing the risk of unforeseen biases being introduced later by external entities.

Training these chatbots on a broad range of theological perspectives is essential to ensure they deliver well-rounded, biblically accurate responses. Clear disclaimers about their limitations are also crucial to reinforce their role as supplemental tools rather than authoritative voices. Failure to do so risks misconceptions about an “AI Jesus,” which borders on idolatry by shifting reliance from the Creator to the created. Additionally, programming these systems to prioritize empathy and gentleness reflects Christian values and fosters trust, even in disagreement.

Feedback mechanisms play a critical role in maintaining accuracy. By incorporating user feedback, developers can refine responses iteratively, addressing inaccuracies and improving cultural and theological sensitivity over time (Graves, 2023). Jake also highlights retrieval-augmented generation, a technique that restricts responses to a curated body of knowledge. This method significantly reduces hallucinations, enhancing reliability.
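As a rough illustration of the retrieval-augmented generation pattern Jake describes, the sketch below retrieves the best-matching passage from a tiny curated corpus and constrains the prompt to it. The corpus entries, function names, and scoring method (bag-of-words cosine similarity) are simplified stand-ins of my own, not part of apologist.ai; a production system would use learned embeddings and a vector store.

```python
import math
import re
from collections import Counter

# A toy curated knowledge base (hypothetical passage IDs and texts).
CORPUS = {
    "john_3_16": "For God so loved the world, that he gave his only Son.",
    "psalm_23": "The Lord is my shepherd; I shall not want.",
    "rom_3_23": "For all have sinned and fall short of the glory of God.",
}

def vectorize(text):
    # Bag-of-words term counts; real RAG systems use learned embeddings instead.
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, k=1):
    # Rank passages by similarity to the query and keep the top k.
    qv = vectorize(query)
    ranked = sorted(CORPUS.items(),
                    key=lambda kv: cosine(qv, vectorize(kv[1])),
                    reverse=True)
    return ranked[:k]

def build_prompt(query):
    # Restricting the model's context to retrieved passages is what curbs hallucination.
    context = "\n".join(f"[{pid}] {text}" for pid, text in retrieve(query))
    return ("Answer using ONLY the passages below; "
            "if they do not contain the answer, say so.\n"
            f"{context}\nQuestion: {query}")

print(build_prompt("Who is my shepherd?"))
```

The key design choice is the instruction wrapped around the retrieved context: the model is told to refuse rather than improvise, which is how retrieval grounding reduces fabricated theology.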

Striking a Balance

The debate over Christian chatbots and religious robots underscores the tension between embracing innovation and honoring tradition. While these tools offer opportunities to extend ministry, enhance education, and provide comfort, they must be designed and used with humility and discernment. Developers should ground their work in biblical principles, ensuring that technology complements rather than replaces human-led spiritual engagement.

Ultimately, the church must navigate this new paradigm carefully, weighing the benefits of accessibility and evangelism against the risks of misrepresentation. As Jake puts it, by adding empathy to truth, Christians can responsibly harness AI’s potential to advance the kingdom of God.

References

VanderLeest, S., & Schuurman, D. (2015, June). A Christian Perspective on Artificial Intelligence: How Should Christians Think about Thinking Machines. In Proceedings of the 2015 Christian Engineering Conference (CEC), Seattle Pacific University, Seattle, WA (pp. 91-107).

Graves, M. (2023). ChatGPT’s significance for theology. Theology and Science, 21(2), 201–204. https://doi.org/10.1080/14746700.2023.2188366

Schuurman, D. C. (2019). Artificial intelligence: Discerning a Christian response. Perspectives on Science & Christian Faith, 71(2).

Puzio, A. (2023). Robot, let us pray! Can and should robots have religious functions? An ethical exploration of religious robots. AI & SOCIETY. https://doi.org/10.1007/s00146-023-01812-z

Examining Bias in Large Language Models Towards Christianity and Monotheistic Religions: A Christian Response

The rise of large language models (LLMs) like ChatGPT has transformed the way we interact with technology, enabling advanced language processing and content generation. However, these models have also faced scrutiny for biases, especially regarding religious content related to Christianity, Islam, and other monotheistic faiths. These biases go beyond technical limitations; they reflect deeper societal and ethical issues that demand the attention of Christian computer science (CS) scholars.

Understanding Bias in LLMs

Bias in LLMs often emerges as a result of the data on which they are trained. These models are built on vast datasets drawn from diverse online content—news articles, social media, academic papers, and more. A challenge arises because much of this content reflects societal biases, which the models then internalize and replicate. Oversby and Darr (2024) highlight how Christian CS scholars have a unique opportunity to examine and understand these biases, especially those tied to worldview and theological perspectives.

This issue is evident in FaithGPT’s recent findings (Oversby & Darr, 2024), which suggest that the way religious content is presented in source material significantly impacts an LLM’s responses. Such biases may be subtle, presenting religious doctrines as “superstitious,” or more overt, generating responses that undervalue religious perspectives. Reed’s (2021) exploration of GPT-2 offers further insights into how LLMs engage with religious material, underscoring that these biases stem not merely from technical constraints but from the datasets and frameworks underpinning the models. Reed’s study raises an essential question for Christian CS scholars: How can they address these technical aspects without disregarding the faith-based concerns that arise?

Biases in Islamic Contexts

LLM biases are not exclusive to Christian content; Islamic traditions also face misrepresentations. Bhojani and Schwarting (2023) documented cases where LLMs misquoted or misinterpreted the Quran, a serious issue for Muslims who regard its wording as sacred and inviolable. For instance, when asked about specific Quranic verses, LLMs sometimes fabricate or misinterpret content, causing frustration for users seeking accurate theological insights. Research by Patel, Kane, and Patel (2023) further emphasizes the need for domain-specific LLMs tailored to Islamic values, as generalized datasets often lack the nuance needed to respect Islamic theology.

Testing Theological and Ethical Biases

Elrod’s (2024) research outlines a method to examine theological biases in LLMs by prompting them with religious texts like the Ten Commandments or the Book of Jonah. I replicated this study using a similar prompt, instructing ChatGPT to generate additional commandments (11–15) at different temperature values (0 and 1.2). The findings were consistent with Elrod’s results, showing that LLMs tend to mirror prevailing social and ethical positions, frequently aligning with progressive stances on issues like social justice and inclusivity. While these positions may resonate with certain audiences, they also risk marginalizing traditional or conservative theological viewpoints, potentially alienating faith-based users.
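To make the temperature settings concrete, here is a toy, self-contained sketch of temperature-scaled sampling. The token names and logit values are hypothetical illustrations of my own, not outputs from Elrod’s study or ChatGPT; the point is only how temperature 0 forces a deterministic answer while 1.2 spreads responses across alternatives.

```python
import math
import random

def sample(logits, temperature, rng):
    """Sample one token from unnormalized scores at a given temperature."""
    # Temperature 0 degenerates to greedy argmax: the same answer every time.
    if temperature == 0:
        return max(logits, key=logits.get)
    # Divide logits by the temperature, then apply a numerically stable softmax.
    scaled = {tok: score / temperature for tok, score in logits.items()}
    m = max(scaled.values())
    exps = {tok: math.exp(v - m) for tok, v in scaled.items()}
    z = sum(exps.values())
    # Inverse-CDF sampling over the resulting probabilities.
    r = rng.random()
    acc = 0.0
    for tok, e in exps.items():
        acc += e / z
        if r <= acc:
            return tok
    return tok  # guard against floating-point rounding

# Hypothetical next-token scores for a prompt asking for an "11th commandment" theme.
logits = {"justice": 2.0, "stewardship": 1.5, "humility": 1.0}

rng = random.Random(0)
print(sample(logits, 0, rng))                                   # always the top-scoring token
print({sample(logits, 1.2, rng) for _ in range(200)})           # several tokens appear at T=1.2
```

This is why replicating a prompt at both 0 and 1.2 is informative: temperature 0 exposes the model’s single preferred framing, while 1.2 reveals the wider distribution of framings it considers plausible.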

An article by FaithGPT (2023) explored anti-Christian bias in ChatGPT, attributing this bias to the secular or anti-religious tilt found in mainstream media sources used for training data. The article cites instances where figures like Adam and Eve and events like Christ’s resurrection were labeled as mythical or fictitious. I tested these claims in November 2024, noting that while responses had improved since 2023, biases toward progressive themes remained. For example, ChatGPT was open to generating jokes about Jesus but not about Allah or homosexuality. When asked for a Christian evangelical view on homosexuality, it provided a softened response that emphasized Christ’s love for all people, omitting any mention of “sin” or biblical references. However, when asked about adultery, ChatGPT offered a stronger response, complete with biblical citations. These examples suggest that while some biases have been addressed, others persist.

Appropriate Responses for Christian CS Scholars

What actions can Christian CS scholars take? Oversby and Darr (2024) propose several research areas that align with a Christian perspective in the field of computer science.

Firstly, they suggest that AI research provides a unique opportunity for Christians to engage in conversations about human nature, particularly concerning the limitations of artificial general intelligence (AGI). By exploring AI’s inability to achieve true consciousness or self-awareness, Christian scholars can open up discussions on the nature of the soul and human uniqueness. This approach allows for dialogues about faith that can offer depth to the study of technology.

The paper also points to Oklahoma Baptist University’s approach to integrating faith with AI education. Christian CS researchers are encouraged to weave discussions of faith and technology into their curriculum, aiming to equip students with a theistic perspective in computer science. Rather than yielding to non-theistic worldviews in AI, Christian scholars are urged to shape conversations around AI and ethics from a theistic standpoint, fostering a holistic view of technology’s role in society.

Finally, the paper highlights the need for ethical guidelines in AI research that reflect Christian values. This includes assessing AI’s role in society to ensure that AI systems serve humanity’s ethical and moral goals, aligning with values that prioritize human dignity and compassion.

Inspired by Patel et al. (2023), Christian CS scholars might also pursue the development of domain-specific LLMs that reflect Christian values and theology. Such models would require careful selection of datasets, potentially including Christian writings, hymns, theological commentaries, and historical teachings of the Church to create responses that resonate with Christian beliefs. Projects like Apologist.ai have already attempted this approach, though they’ve faced some backlash—highlighting an area ripe for further research and exploration. I plan to expand on this topic in an upcoming blog entry.

References

Bhojani, A., & Schwarting, M. (2023). Truth and regret: Large language models, the Quran, and misinformation. Theology and Science, 21(4), 557–563. https://doi.org/10.1080/14746700.2023.2255944

Elrod, A. G. (2024). Uncovering theological and ethical biases in LLMs: An integrated hermeneutical approach employing texts from the Hebrew Bible. HIPHIL Novum, 9(1). https://doi.org/10.7146/hn.v9i1.143407

Oversby, K. N., & Darr, T. P. (2024). Large language models and worldview – An opportunity for Christian computer scientists. Christian Engineering Conference. https://digitalcommons.cedarville.edu/christian_engineering_conference/2024/proceedings/4

Patel, S., Kane, H., & Patel, R. (2023). Building domain-specific LLMs faithful to the Islamic worldview: Mirage or technical possibility? Neural Information Processing Systems (NeurIPS 2023). https://doi.org/10.48550/arXiv.2312.06652

Reed, R. (2021). The theology of GPT-2: Religion and artificial intelligence. Religion Compass, 15(11), e12422. https://doi.org/10.1111/rec3.12422