After Research: The Challenge of Information Literacy in a Post-Truth Culture


Within information behavior, an emerging trend involves practices premised on the assumption that objective research is impossible, especially within the context of digitally-mediated information. In qualitative research conducted in 2020, consisting of interviews about controversial subjects, I found that Christians who considered themselves conservative routinely disputed the possibility of performing objective research on the veracity of online sources. This contention was often fueled by reflections on Google, which is increasingly perceived to be the gatekeeper of all information. There is a widespread belief that all information is filtered through the popular search engine, and that objective source evaluation and fact-checking are therefore no longer possible. Those who took this position appealed to a range of heuristics to overcome the perceived limitations that Google imposes on information acquisition and evaluation. These heuristics included the use of credibility and authority as surrogates for critical thinking, in addition to various emotion-based verification criteria. This article explores the challenges and opportunities these trends create for librarians working at Christian institutions, with some proposals for further areas of research and action.

Challenges Facing Librarians

“It’s hard to teach students to apply critical thinking to source evaluation, because they increasingly assume that all information is biased.”

The above comment was made by a librarian in a workshop I attended at the annual convention of the Pacific Northwest Library Association. This librarian, whose job consisted of teaching information literacy to freshmen, went on to explain that students frequently justify using unreliable web sources on the grounds that “all information is biased anyway.” On this way of thinking, the difference between real news and fake news is just a point of view.

This librarian was not alone in his observation. Over the last four years, various cultural commentators have noted that we are entering a post-truth culture in which the very concept of factuality comes to be contested (McIntyre, 2018). Whereas lawmakers and pundits once occupied themselves with disputes over facts, they now increasingly appeal to “alternative facts,” or relativistic tropes such as “you have your facts, and I have my facts.”

It is troubling to see research showing that students increasingly cannot tell the difference between real and fake news (Domonoske, 2016), and perhaps more troubling still when students do not care about knowing the difference, or dispute that there is any difference between what is real and what is fake.

Authority and Trustworthiness as Surrogates for Critical Thinking

Widespread skepticism about the possibility of objective research has, as one of its correlates, a growing reliance on authority and credibility as surrogates for reason and critical thinking within the process of information retrieval and evaluation. We know from survey data that individuals are likely to assign “trustworthiness” to sources that echo their own perspectives. Lee McIntyre reports:

“In a 2014 Pew survey that asked Americans to name their ‘most trusted’ news source, there was a predictable partisan split. Among self-identified conservatives, Fox News led with 44 percent. With liberals, it was network broadcast news at 24 percent and a more or less three-way tie for second place between public television, CNN, and Jon Stewart’s The Daily Show” (McIntyre, 2018, p. 73).

An individual who trusts a certain outlet as being reliable will be disincentivized from performing due diligence on the outlet’s claims even when those claims may be controversial or contested, because the person will feel justified in simply assuming the information is true on the basis of the credibility he or she has assigned to the outlet. The question is not, “How do you know that is true?” but “Did you hear that on CNN or Fox?”

The notion that authority and credibility have become surrogates for critical analysis was impressed upon me last year, shortly after the aforementioned library conference, when I was involved in an online discussion about COVID-19. That was in the early days of the pandemic, when there was still widespread skepticism concerning the reality of the virus. One skeptic told me that questions about the existence or nonexistence of COVID-19 boil down to who you trust. I responded by pointing out that “we can exercise our own critical thinking and research skills and then make a well-informed judgment that does not involve putting blind faith in any organization or body.” As a graduate student in library science specializing in information literacy, I did not think I had made a particularly controversial statement. After all, standard Library and Information Science (LIS) curricula include courses in critical thinking and best practices for source evaluation. (I have aggregated many of these best practices into my YouTube tutorials and articles.)

Yet in our post-truth culture, the very suggestion that critical thinking can assist in objective source evaluation is itself hotly contested. If all information is biased and agenda-driven, then source evaluation skills and critical thinking must simply be proxies for one’s own bias, or so the narrative goes. This assumption was reflected in the gentleman’s response: “Your solution to sifting out the truth is to take your own counsel, and to trust your own judgment.” On his way of thinking, the skills-based approach I had referred to was simply a post hoc rationalization for trusting my own judgment. Instead, this gentleman (and others) told me, we should look at where a person is coming from and whether their ideological affiliations render them trustworthy. On this view, the entire discipline of information literacy, together with its best practices for using critical thinking in source evaluation, simply masks the bottom line: we all must decide which information sources to trust.

My Due Diligence Experiment

Hoping to discover if my exchange with the aforementioned gentleman was an anomaly, I engaged in some qualitative research. Unbeknownst to my friends, I made all of them my test subjects.

Throughout 2020, there were numerous controversies surrounding COVID-19, the presidential election and spin-off controversies, and various conspiracy theories. I was part of a community in North Idaho that took a keen interest in these issues, often through the lens of alternative news sources. For my experiment, every time a friend sent me a link to a controversial news item, I would respond as follows and then see what happened:

“Thank you for sharing this! I will be initiating a process of due diligence on this source before offering any comment. If you would like to be included in the process of due diligence, please let me know.”

In offering this reply, I wanted to find out how people would respond to the idea of performing due diligence on information sources instead of just accepting sources that felt right. If my hypothesis was correct, the very idea of performing due diligence would be perceived as naïve, or even as a capitulation to “the other side.”

In the volatile political environment leading up to the 2020 election, I received lots of links from well-meaning friends. But whether the link was to a story about QAnon, theories about the origin of the pandemic, or a meme claiming that Bill Gates created COVID-19, I would reply with the aforementioned response, informing my friend that I was performing due diligence and inviting him or her to be involved in the process.

I should say, at the outset, that my sample was not randomized. The friends I was interacting with were all socially, politically, and religiously conservative. They would all strongly repudiate relativism and would defend objective truth with the same tenacity with which they would defend the right to bear arms or the necessity of opposing abortion. This makes the results of my study all the more ironic. There was not a single case in which one of my interlocutors responded positively after learning that I was performing due diligence on the story or information they had sent. In many cases the person registered implicit or explicit hostility. Significantly, and this is where the irony comes in, this hostility often hinged on the claim that there is no objective standard in which to ground due diligence and fact-checking.

On a few occasions I responded to push-back by sharing an article I had written about the use of logic and critical thinking in the fact-checking process. One gentleman, a staunch conservative and defender of traditional values, replied to my article about critical thinking by saying, “To me it seems a lot of time wasted. Using common sense will give you the answer quicker.” He went on to explain that everyone must simply use his or her own private judgment, because no one can judge for another what information is reliable and what is not. At the end of the day, we must each rely on common sense and our personal BS-sensor. In one of his many messages to me, he wrote,

“I told you about checking anything. You can’t. Media is serving the deep state. The process should be very simple. You can use your judgement. If it makes sense it is real. Anything you try to get from Google is not reliable. Fact check is not what used to be.

Otherwise why they will [sic] censor people?… Who are we to judge for others…. Fact check is not what it used to be….Spending huge amounts of time in researching it will not help… I told you about checking anything..”

These words, which might be multiplied through additional case studies, are significant because they show that skepticism about the possibility of objective research does not always arise from the obvious ideological origins we might expect, such as philosophical relativism, postmodern skepticism, or epistemological subjectivism. Rather, it often arises from the concern that all information is biased, along with the belief that entities with sinister goals have polluted the water at its source by controlling our access to information. According to this narrative, all our access to information is mediated by the Deep State, and therefore fact-checking, research, and due diligence recede into anachronism. At the heart of this skepticism is a belief that Google has become a sinister gatekeeper of all information.

It will be helpful to look more closely at how people’s perceptions of Google have been fueling this skepticism.

Google-Fueled Skepticism

As already mentioned, the specter of Google loomed large throughout my experiment. In many spin-off conversations, both in brief text exchanges as well as in-person dialogues, individuals repeatedly shared concern that Google had polluted the informational environment; consequently, practices such as objective fact-checking and source evaluation are no longer possible within digital environments. Often my friends conveyed this by putting scare quotes around “fact-checking.” One gentleman was so concerned about my approach to fact-checking that he challenged me to a written debate, which we later published online. In the debate he routinely questioned the very possibility of fact-checking.

The recurring argument throughout these conversations tended to run something like the following:

  • Premise 1: Google controls our access to information and mediates their own version of reality to us.
  • Premise 2: The very concept of “fact-checking,” and “fact-checkers,” has been colonized by the unholy alliance between the Deep State and Big Tech, with Google as the primary culprit.
  • Premise 3: Given Premises 1 and 2, it follows that those who believe it is possible to engage objectively in activities like “online research,” “digital source evaluation,” and “fact-checking” are exhibiting extreme naivety.
  • Conclusion: Therefore, the best we can hope to do is employ various heuristics that enable us to use online information while bypassing Google’s control.

This argument is a synthesis and summary of argumentation that was expressed less formally through conversations and text messages. But the basic concerns expressed in Premises 1-3, together with the conclusion drawn, surfaced with remarkable consistency across my conversations.

The concerns expressed in Premise 1 are not without warrant. For years, most people’s access to information has indeed been mediated through Google, which is widely known to be biased. Throughout 2020, Google was the subject of hearings on Capitol Hill, as lawmakers became increasingly concerned about the power the company wielded in being able to moderate content and present a curated version of reality to entire populations.

I encountered Google’s bias first-hand last year when my brother Patrick gave me an opportunity to participate in a discussion for his YouTube channel about whether COVID-19 was real or not. My brother took the position that the pandemic was fake, and I argued that it was real. After the conversation was over, Patrick uploaded the video to his channel, where numerous people enjoyed watching our friendly interchange of ideas. A few weeks later, Google removed the video from YouTube for “violating community standards.”

Google’s censorship practices have been well documented (Epstein, 2016). Not only does Google carefully curate its search results to reflect the political and philosophical bias of the company; it also customizes each individual’s search results to reflect the biases and preferences unique to that individual. Every time you use the internet on your phone or PC while logged into your Gmail account, Google is harvesting your information. Gradually, Google learns what you want to see. This is why the same Google search will bring up different results on your personal computer than on a library computer. I found this out the hard way many years ago, after deciding to perform a Google search on myself. I was impressed to see the first two or three pages of search results all about me, and nothing else. “I’m famous!” I thought. For a few days my ego swelled, until I ran the same search on someone else’s computer and found myself outranked by all the other people in the world who share my name. Google had given me what its bots knew I wanted to see!

In sum, there is no getting around the truth of Premise 1. Google does indeed offer each of us a carefully curated view of the world, one that combines our own preferences with the company’s economic, political, and ideological agenda. Although the exact formula the company uses for curating each person’s information is part of its proprietary algorithm, we can observe its effects.

As increasing numbers of people become alert to Google’s reach, one might expect a new wave of interest in research skills, including techniques for using the search engine more strategically. One might also expect popular interest in alternative search engines, and in the various proprietary retrieval networks used by academic researchers and available to the public at any university library. One might expect to see people returning to books and periodicals for research, as in the days before the internet, and taking a widespread interest in best practices for objectively evaluating information online. While many people are no doubt moving in these directions, the people I talked to in the course of my experiment registered widespread skepticism concerning the very idea of objective evaluation and acquisition of online information.
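For readers unfamiliar with such techniques, a few examples may help. The queries below are my own illustrations (not drawn from the interviews) of Google’s long-standing advanced search operators, which narrow results far more precisely than a plain keyword search:

```text
"information literacy" site:.edu      exact phrase, restricted to .edu domains
media literacy filetype:pdf           return only PDF documents
intitle:"fact-checking"               the phrase must appear in the page title
"least effort" -shopping              exclude pages containing a given term
```

Comparable operators exist on most alternative search engines, so these skills transfer even for researchers who are wary of Google itself.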

Information Fatigue

The skepticism fueled by Google does not arise solely from the company’s agenda. Apart from questions of bias, Google represents the unprecedented proliferation of information. Such proliferation leaves many people feeling that they are swimming in a sea of data, a condition I call “information fatigue.”

During some interviews I encountered a type of information fatigue that I hypothesized was positively correlated with (1) information skepticism, (2) apathy about performing research to adjudicate between competing information claims, and (3) antagonism towards the very idea of performing due diligence on information sources. Here is an example, from an online interaction, where information fatigue was paired with epistemological skepticism.

“There are few facts. Even data is skewed depending on the experimenter’s bias…. what is determined to be ‘fact’ is subjective…

The point I am making is that many if not most declared facts are not proven facts for most people. The majority of things stated as fact for people is merely based on who they trust in who or what they put their faith in. How is it fact for you that the color you see and call “red” is the same color I see and call “red”? How is it fact that both your parents are your biological parents? How is it fact that your children are your biological children? How is it fact that you have never caused someone’s death? How is it fact that you are not a test tube baby? How is it fact that you have any money in the bank right now? How is it fact that that you are not in a simulation of life but really your body is in an electrolyte bath wired up for virtual reality? How is it fact that your car was actually made by Toyota?”

Notice in the above comment that the epistemological vacuum created by extreme skepticism is filled by trust, with the subject claiming that in the end we simply have to choose who to trust. We have already discussed how trust and “faith” come to be heuristics that function as surrogates for critical thinking, but it will be worthwhile to look deeper at the nature and definition of heuristics.

Heuristics as a Solution to Google-Fueled Skepticism and Information Fatigue

Throughout these various conversations, it was a curious fact that those who believed Google had rendered objective research and fact-checking impossible did not tend to be agnostic concerning the acquisition of contested knowledge, nor did they adopt epistemological skepticism, let alone intellectual humility. In fact, the opposite was the case: these individuals tended to be dogmatic and definitive in asserting their opinions and in sharing those opinions with others. Equally ironic, those who most hotly disputed the possibility of objective online research and fact-checking were not the ones turning away from the internet to use physical resources; rather, they tended to spend an inordinate amount of time online. How could this be? The answer is that these individuals overcame skepticism by asserting the necessity of a variety of heuristics.

The concept of heuristics has received widespread treatment within LIS scholarship over the last ninety years. In the most basic terms, a heuristic is a “rule of thumb” method of problem-solving that may be practically effective while yielding only approximate outcomes. In the LIS literature, heuristics are often linked to a user’s desire to minimize time expenditure during information-gathering activities. The desire of users to maximize convenience has roots in S. R. Ranganathan’s classic 1931 work The Five Laws of Library Science (Ranganathan, 1931). The principle of minimizing time expenditure has been described in a variety of ways, including “The Principle of Least Effort” (Bierbaum, 1990) and “low-cost” information strategies (Schwieder, 2016). These theories often draw on Zipf’s classic work showing that people will typically choose the easiest way of accomplishing any task (Zipf, 1949). Perhaps the most familiar heuristic is scanning the first page of results from a Google query and rarely moving past it (Asher et al., 2013).

These heuristics make sense within the context of rational choice theory, a framework developed in the field of economics which posits that when the benefits of a task are too low to justify high time and effort expenditure, heuristic strategies will be favored on cost-benefit grounds (Larrick et al., 1993).
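The cost-benefit logic can be made concrete with a toy model (my own illustrative sketch, with invented numbers, not a calculation from Larrick et al.): each strategy yields an expected payoff equal to the value of being right, weighted by the strategy’s accuracy, minus its time cost. When the stakes are low, the fast-but-unreliable heuristic wins; when the stakes are high, due diligence pays.

```python
# Toy rational-choice model of heuristic vs. due diligence.
# All numbers are hypothetical, chosen only to illustrate the trade-off.

def expected_value(benefit, accuracy, time_cost):
    """Expected payoff: value of a correct judgment weighted by accuracy, minus time cost."""
    return benefit * accuracy - time_cost

def preferred_strategy(benefit):
    # Due diligence: accurate but slow. "Feels right" heuristic: fast but unreliable.
    diligence = expected_value(benefit, accuracy=0.95, time_cost=5.0)
    heuristic = expected_value(benefit, accuracy=0.60, time_cost=0.1)
    return "due diligence" if diligence > heuristic else "heuristic"

print(preferred_strategy(benefit=1.0))    # low stakes  -> heuristic
print(preferred_strategy(benefit=100.0))  # high stakes -> due diligence
```

On this model, the irony is plain: raising the perceived cost or lowering the perceived payoff of verification (for example, by convincing people that fact-checking is futile) makes the heuristic look rational at any stakes.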

In the present context, heuristics emerged out of concern that normal avenues of online research and fact-checking are no longer possible, given Google’s pollution of the informational environment. The heuristics preferred by individuals alert to Google’s distortions included various appeals to common sense, gut-instinct, and a range of emotion-based strategies which put stock in the user’s ability to discern if something “feels right.” Below is a list of many popular heuristics I encountered in my conversations. All of these are exact quotations from interviews:

  1. “Using common sense will give you the answer quicker.”
  2. “You can use your judgement. If it makes sense it is real.”
  3. “It all comes down to who you choose to believe.”
  4. “See if it feels right.”
  5. “Mostly I believe, and as things are proven in my experience, I may call them facts.”
  6. “When evaluating information, you sometimes have to have faith.”
  7. “I trust my inner BS sensor because it has a lot of experience behind it.”

These and similar approaches offer appealing short-cuts to information retrieval and source evaluation – a low-cost alternative to performing due diligence. From a cost-benefit perspective, this makes a certain sense. After all, getting a sense of whether a video or article “feels right” or “rings true” enables a person to save hours of painstaking research into the source’s claims. And while such heuristics would seem irrational under normal circumstances, the skepticism generated by concern about Google lends them a sense of rationality. In fact, many of the individuals I spoke with claimed the high ground of rationality by suggesting that I was the irrational one for clinging to outdated concepts like objective research and fact-checking, behaviors that, on their view, reveal a naïve underestimation of Google’s control over all information.

Curiously, both those who use Google uncritically and those who lapse into skepticism about the very possibility of performing due diligence on an information source share a common assumption: the only way to achieve knowledge is through Google. The latter group infers that because Google is biased and unreliable, objective due diligence and research are at best a waste of time. All we are left with is various short-cuts to the acquisition and evaluation of competing claims, including our gut instinct about what feels right and our own inner BS sensor. Through such appeals to gut instinct, common sense, and feelings, individuals are able to maintain confidence in the objectivity of their opinions, even when those opinions have been generated by ad hoc digital retrieval activities.

Perspectivism and Post-Truth

The above heuristics were often justified through appeals to a perspectivist ideology. The idea of “perspectivism” is that there is no objectively correct way of looking at the world; rather, all we have is different people’s opinions. Although conservative Christians have traditionally opposed perspectivism in principle, it nevertheless has surfaced among conservative Christians within the context of information behavior.

Usually I had to explain more about my due diligence procedures to provoke a perspectivist response. Occasionally after saying, “I will be initiating a process of due diligence into what you have shared, and you are welcome to be part of this process,” a person would inquire about my methods. I would begin explaining how Google offers researchers a variety of tools for performing searches with a high degree of granularity that many people do not know about or have never used (Russell, 2019; Schwieder, 2016). I would also point out ways to do research independently of Google, through periodicals, proprietary databases, alternate search engines, and books. Significantly, when I put forward this information I would often receive the following rejoinders, many of which contain perspectivist assumptions:

  • the difference between real news and fake news is just a point of view;
  • everyone has their bias, their particular spin, so by finding online sources that simply back up my own opinion, I’m just doing what everyone else does;
  • everyone cherry-picks resources that fit their point of view – there is no such thing as objectivity;
  • by pressing the necessity of these research skills, you are taking away people’s free will, because each of us has the right to choose what he believes;
  • you can’t even fact-check anymore, because people on the left have their facts and people on the right have their facts – you just have to choose based on your worldview.

These types of statements have become the hallmark of our post-truth culture. The concept of a post-truth world is not that people willingly perpetrate falsehood, but that the very ideas of truth and falsehood recede into anachronism. In the most cynical version of post-truth epistemology, appeals to facts and truth are just proxies for power. The term “alternative facts” is often used within this context. It was popularized by Kellyanne Conway during a “Meet the Press” interview on January 22, 2017. When asked why President Trump’s press secretary had misreported the size of the inauguration crowd, Conway did not defend the factuality of the original statement but appealed to the idea of “alternative facts.” The phrase quickly became one of the most notable quotes of 2017, a catchphrase highlighting the growing belief that individuals are left simply to choose which facts to accept among competing systems.

In practice, perspectivism creates space for us to take our news from social media. After all, if the difference between real news and fake news is just a point of view, then there is no essential difference between getting information from Facebook vs. NPR. On this way of thinking, actual research simply provides gloss and rationalization for what is fundamentally a retreat to perspective.

Lee McIntyre reports that “62 percent of US adults reported getting their news from social media, and 71 percent of that was from Facebook. This means that 44 percent of the total adult US population now gets its news from Facebook” (McIntyre, 2018, p. 94). It is no coincidence that, as increasing numbers of young people veer towards perspectivism, more people are getting their news from social media. Yet the cause and effect is not necessarily a one-way street, for one effect of having news customized through social media may be to reinforce the idea of perspectivism.


Theological Perspectives and Suggestions

Based on these observations, I would like to make five proposals for academic librarians working in Christian institutions, including theological librarians and freshman engagement librarians at theological institutions such as seminaries and Christian universities.

First, the research I did was qualitative, and it emerged organically from unstructured interviews and conversations. The conversations were not recorded, and the data were not systematically collected. It would be profitable for librarians to explore these same questions using rigorous quantitative methods, including testing for differences across demographics, geographical regions, races, and political beliefs.

Second, work could profitably be undertaken to explore the role librarians can play in laying the philosophical and theological groundwork for teaching information literacy to students.

Many institutions of higher learning have librarians specializing in freshman engagement and information literacy, whose role is to help incoming students become competent in basic research skills. These librarians often work alongside professors in various classes to help freshmen become familiar with library resources, including, and perhaps especially, the digital resources that now make up the bulk of most libraries’ holdings. This information literacy work is of crucial importance, but it may not be enough. Given the ideological roadblocks that challenge the very idea of objective research, it may become necessary for librarians first to lay groundwork by clearing away philosophical and theological baggage, showing that critical thinking is both possible and necessary within online contexts. This could occasion fruitful collaboration between librarians and academics in other departments, including professors of theology and philosophy.

Third, and related to the previous consideration, I suggest that theoretical and practical work could profitably be undertaken on information literacy as mission, and information literacy as pastoral care. Here again, a fruitful collaboration could be achieved by librarians working at Christian institutions with faculty in other departments. The pastoral role of information literacy should be obvious, given the spiritual challenges for anyone living in an environment where it is difficult objectively to distinguish between what is true and what is fake. Yet questions remain that have yet to be adequately explored, including how the deficit in information literacy might impact receptivity to the gospel or Christian doctrine, and whether it might hinder people’s experience of Christian community.

Fourth, Christian librarians may need to spend more time teaching skills of metacognition and epistemic virtue, helping students become alert to factors that can impede proper evaluation of information, and which may even work against the desire to take a skills-based approach to information acquisition and evaluation.

One of the problems with many of the heuristics discussed above is that they may offer rationalization for simple hastiness, laziness, confirmation bias, or mental inflexibility. Where such conditions are present, simply teaching students proper skills may be of limited efficacy if the librarian has not first helped the student acquire epistemic virtues. Epistemic virtues include traits such as mental flexibility and adaptability, attention to detail, fair-mindedness, tolerance for complexity, and metacognition (the ability to monitor one’s own thought processes and thus notice when bias or hastiness is at work). Much work remains to be done on the relationship between epistemic virtues and epistemic skills, but the literature in ethics already offers a clue in its rich discussion of the relationship between moral virtues and moral skills (Zagzebski, 1996, pt. 2). Although this would have to be tested, my hypothesis is that teaching epistemic virtue could help lay the groundwork for instruction in more specific information literacy skills, such as the effective acquisition and evaluation of digitally-mediated information.

Fifth, librarians could profitably partner with priests and youth pastors to bring information literacy skills and epistemic virtues to high school students in fun and accessible ways. Librarians could offer brief presentations to youth groups, as well as provide “cheat-sheets” that give students a small tool-box of information literacy skills. I am doing this in my own church by creating bookmarks with epistemic virtues on one side and best practices for information retrieval and source evaluation on the other.


Conclusion

I have shared how, after I invited friends to pursue joint research on the information they sent me, they routinely challenged the very idea of performing objective due diligence. It is significant that their concern was not that due diligence would be too difficult, or that it would take too long to acquire the skills for critically evaluating online sources; rather, it was often that fact-checking and due diligence are simply not relevant in the age of Google. For some, the very desire to engage in objective fact-checking reveals gullibility concerning the control of the powerful search engine. While my interlocutors understood something important about Google’s censorship of information, they tended to assume that the only way to bypass the hegemony of Google’s control was through a range of heuristics. These heuristics often involved non-rational processes, including a heightened emphasis on feeling, the role of common sense, and users’ beliefs about their own authority and judgment-making capabilities. Through these heuristics, individuals were able to overcome the skepticism they directed at my fact-checking activities while maintaining the validity of their own assessments of the digital information they were absorbing and sharing. When I attempted to educate friends about critical ways to circumvent Google’s control, the original heuristics were sometimes defended from the standpoint of a perspectivist epistemology.

If these trends are at all widespread, then much work is needed, and Christian librarians may be well poised to undertake certain aspects of that work, including interdisciplinary work in the academy and collaborations between academy and church.


Works Cited

Asher, A. D., Duke, L. M., & Wilson, S. (2013). Paths of discovery: Comparing the search effectiveness of EBSCO Discovery Service, Summon, Google Scholar, and conventional library resources. College & Research Libraries, 74(5).

Bierbaum, E. G. (1990). “Save the Time of the Reader”: A Paradigm for the ’90s. American Libraries, 21(1), 18–19.

Domonoske, C. (2016, November 23). Students Have “Dismaying” Inability To Tell Fake News From Real, Study Finds. NPR.

Epstein, R. (2016, June 22). The New Censorship. US News & World Report.

Larrick, R. P., Nisbett, R. E., & Morgan, J. N. (1993). Who Uses the Cost-Benefit Rules of Choice? Implications for the Normative Status of Microeconomic Theory. Organizational Behavior and Human Decision Processes, 56(3), 331–347.

McIntyre, L. (2018). Post-truth. MIT Press.

Ranganathan, S. R. (1931). The five laws of library science. Madras Library Association; E. Goldston.

Russell, D. (2019). The Joy of Search: A Google insider’s guide to going beyond the basics. MIT Press.

Schwieder, D. (2016). Low-effort information searching: The heuristic information-seeking toolkit. Behavioral & Social Sciences Librarian, 35(4), 171–187.

Zagzebski, L. T. (1996). Virtues of the Mind: An Inquiry Into the Nature of Virtue and the Ethical Foundations of Knowledge. Cambridge University Press.

Zipf, G. K. (1949). Human behavior and the principle of least effort: An introduction to human ecology. Hafner.

