Wisewire Blog


Understanding and Avoiding Plagiarism

To anyone in the business of creating content, plagiarism should be a bright red line: Do Not Cross! Yet the question of what plagiarism is, and how to avoid it, is surprisingly complicated. A good starting point is the dictionary. Merriam-Webster defines plagiarize as “to steal and pass off (the ideas or words of another) as one’s own.” This definition emphasizes a key point: officially, plagiarism is an intentional act—akin to stealing and thus unethical (if not always illegal). What muddies the water is that many apparent instances of plagiarism are unintentional—someone copies another person’s ideas or words not from malice or laziness but because they don’t understand how to properly paraphrase or cite their sources.

From a practical standpoint, the question of intent is relatively unimportant: if you do not credit your sources, you expose yourself to all manner of liability, ignominy, and other negative consequences. It is therefore imperative for professional content creators to understand exactly what plagiarism is. (I refer to content creators rather than writers, since one can plagiarize aural and visual content—including pictures, video, and audio recordings—as well as text.) Let’s debunk some common misconceptions.
MISCONCEPTION 1: It isn’t plagiarism if you change the words.
This is actually two errors. First, avoiding plagiarism requires more than replacing every other word with a synonym. Compare these two statements. Here is the original, from Charles Dickens’s 1850 novel, David Copperfield:

“Whether I shall turn out to be the hero of my own life, or whether that station will be held by anybody else, these pages must show.”

Now read this “paraphrase”:

“If I end up as the protagonist of my personal existence, or if this role is taken by another person, this book will tell.”

In addition to being clunky, this is still plagiarism! Remember, plagiarism refers to ideas as well as words, and the specific way that an author arranges words on a page—the syntax and structure of a phrase, sentence, or paragraph—is a crucial part of how authors express their ideas. To truly put a statement into “your own words,” you must put different words in a different structure.

What about the second error? Even if you appropriately paraphrase someone else’s idea, you must still give that person credit. A successful paraphrase may even incorporate a quotation, as long as it clearly identifies the source: David Copperfield, the protagonist of Charles Dickens’s 1850 novel, introduces himself to readers by wondering if he or another person shall be “the hero of my own life.”
MISCONCEPTION 2: You can’t plagiarize from the public domain.
As noted, the novel David Copperfield was published in 1850. It is in the public domain, which means it is not protected by copyright and anyone can reproduce it, in any manner, without legal liability. (How to determine what is in the public domain could be the subject of another post, but generally, anything published prior to 1923 is free to use.) Plagiarism and copyright are two distinct concepts, however. Just because the law permits you to freely reproduce content, ethics and integrity—not to mention, if you are being paid for your work, a contractual obligation—require you to credit the source of that content. Otherwise, you are guilty of plagiarism.
MISCONCEPTION 3: You can reproduce any content as long as you credit the source.
This is essentially the reverse of the previous error. In fact, if content is protected by copyright, you are not free to reproduce it. That is the purpose of copyright: to limit the right to copy a work. To reproduce protected content, you must get the permission of the copyright holder—a time-consuming process that typically involves payment. And of course, if you do get permission, you must credit the creator.

Some people assume the principle of fair use permits them to reproduce part of a copyrighted work for a limited range of (usually noncommercial) purposes: for example, printing an excerpt from a speech or a photograph of a march in a history textbook. In fact, fair use is a set of guidelines for judges to apply on a case-by-case basis; as the U.S. Copyright Office emphasizes, “the outcome of any given case depends on a fact-specific inquiry.” In other words, there is no clear-cut statute that content creators can rely on. (Brief quotations for the purpose of commentary or criticism are perhaps the safest application of fair use—it is the reason critics may include quotes from the latest releases in their reviews.) If you reproduce copyrighted content without authorization, regardless of whether you credit the source, you may be sued, and you will have to convince a judge that your usage is genuinely “fair.”
MISCONCEPTION 4: As long as I’m copying common knowledge, it’s not plagiarism.
This is less a misconception than an oversimplification. It is true that sources of common, or general, knowledge do not have to be cited . . . but what knowledge counts as common? Unfortunately, different sources have different answers! According to the Purdue Online Writing Lab, “you can regard something as common knowledge if you find the same information undocumented in at least five credible sources.” On the other hand, the Massachusetts Institute of Technology (MIT) gives vaguer advice: “Broadly speaking, common knowledge refers to information that the average, educated reader would accept as reliable without having to look it up.”

Keep in mind that different audiences have different storehouses of common knowledge. If you are writing an academic paper for chemists, for example, the “average, educated reader” will accept many more statements as commonly known than if you are explaining a simple experiment to middle schoolers. Remember too to paraphrase the common knowledge and—as always—cite your source! Ultimately, the advice on this topic boils down to one key point: When in doubt, use your own words and cite your source. Better for your audience to absorb an unnecessary citation than to think a missing citation is evidence of plagiarism.

* * * * *

As content creators, we have a variety of reasons for incorporating other people’s words and ideas into our work. Among these reasons is the desire to honor and respect those whose own work has informed our own and made it possible. Seen in this light, plagiarism is the ultimate sign of disrespect. Understand it, and avoid it.

Main Sources:

https://owl.purdue.edu/owl/research_and_citation/using_research/avoiding_plagiarism/is_it_plagiarism.html
http://wpacouncil.org/aws/CWPA/pt/sd/news_article/272555/_PARENT/layout_details/false
https://lsa.umich.edu/sweetland/undergraduates/writing-guides/how-do-i-effectively-integrate-textual-evidence-.html

Evaluating Sources

We live in an information age, you may have heard people say. Indeed, typing “information age” into a popular search engine just now produced over 4 billion results in half a second! The problem is that much—perhaps most—of the information available, whether in print or online, is unsubstantiated, misleading, or downright false. How can we determine whether the information we search for is true? In practice, we don’t usually verify particular facts. Instead, we focus on evaluating the credibility, or trustworthiness, of a fact’s source. I don’t know from personal experience or observation that Earth is one of eight planets in the solar system or that Abraham Lincoln delivered the Gettysburg Address on November 19, 1863. Instead, I accept these claims as true because I have learned them from sources I trust. The better question, then, is how we may determine whether a source is credible. Unfortunately, there is no set of rules guaranteed to sort the good sources from the bad. Even the best sources—the most historically trusted and reliable—sometimes make mistakes or give incomplete accounts. Evaluating sources, then, should be a flexible process designed to help us do two main things:
  1. Identify untrustworthy sources and reject them.
  2. Identify promising sources and establish their credibility. 
The first goal can often be achieved quickly if we know how to spot red flags. The second goal must be pursued more methodically. In both cases, however, the process begins with asking questions. Here are some important questions to ask:

Who is the source’s author?

By “author,” I mean the person giving the information, whether through text, speech, film, or some other medium. The first step in evaluating a source is to identify the author and determine their qualifications: what authority or expertise supports their claims? If you cannot learn an author’s qualifications, you have no reason to trust them. Qualifications do not have to be formal; someone who has lived in another country may know more about its culture than an academic who has never visited. However, they should be relevant to the specific topic being discussed. A professor of U.S. history is likely an authoritative source of information about democracy in the United States, but you should be more skeptical of claims they make about democracy in ancient Greece or communism in China.

You should also investigate whether an author has any biases or conflicts of interest that may hinder their ability to explain or describe something fairly. A politician sponsoring a bill may not be an objective source of information about that bill. A researcher who has been paid by a toothpaste manufacturer to study the benefits of a new brand of toothpaste may feel pressure to report positive results. Just as someone who lacks formal qualifications may still be a credible source, an apparent conflict of interest doesn’t necessarily prevent someone from giving accurate information. But it puts a greater onus on their audience to seek out additional information from less biased sources.
An example . . .
The importance of identifying and “vetting” a source’s author is a major reason why Wikipedia is not generally considered credible. Contrast Wikipedia’s article about World War II with Encyclopedia Britannica’s: At the top of its article, Encyclopedia Britannica clearly identifies the two primary authors by name; each name links to a biography, revealing one author to be an academic with extensive relevant experience, and the other author to be an associate editor for the encyclopedia. (Encyclopedia Britannica has a long, trusted history as a credible publisher of information, and therefore—as the next section will emphasize—it is reasonable to trust its editors.) In contrast, Wikipedia does not identify authors by name. Anyone can view the history of changes to an article (click on the “View history” tab, located at the top-right of the page), but these changes are attributed only to usernames; some usernames do provide a biography, but these biographies cannot be verified. (Encyclopedia Britannica also allows users to view an article’s history, but these changes are attributed to specific employees of the company.)

How has the information been published?

Usually the author of information is different from the person or organization that publishes it, but sometimes they are the same. Regardless, many of the questions to ask about a source’s author also apply to its publisher. Does the publisher have a longstanding reputation for integrity, for example, or a history of bias, egregious error, or propaganda? Generally, if you can establish a publisher’s credibility, you may assume the publisher has appropriately vetted its authors. Keep in mind, however, that credible publishers may produce a variety of content. A newspaper has sections for news as well as opinions, for example, and opinions are not usually expected to be authoritative or unbiased.

The question of how information has been published is particularly important when evaluating online sources. You can make some broad assumptions from a website’s domain extension: the brief (usually three-letter) code that follows the “dot” in a URL. For example, .gov is used by government agencies and .edu by academic institutions. These sources are likelier to be credible than websites that end in .com, which typically belong to commercial companies or private individuals, or .org, which was initially intended for non-profit organizations but may in fact be used by anyone who pays the registration fee. Don’t take anything for granted, however. Many colleges and universities host personal web pages for their students—these pages also use the extension .edu, but they are less credible than pages created by faculty. And a commercial website may provide useful information about a company’s products or history.

Finally, it is worth considering a source’s overall appearance and design. While an attractive façade is no guarantee of quality, it can be a sign of professionalism and competence—just as typos and disorganization may signal the opposite. A publisher that consistently produces clear, clean content probably also has the resources to hold its authors to high standards and fact-check their claims.
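If you triage many links at once, the domain-extension check above is easy to automate. Here is a minimal sketch in Python; the helper name and the hint table are illustrative assumptions, and an extension is only a rough signal, never proof of credibility:

```python
from urllib.parse import urlparse

# Rough hints by domain extension, mirroring the guidance above.
# Illustrative only: a .edu student page or a paid .org can defy these labels.
EXTENSION_HINTS = {
    "gov": "government agency",
    "edu": "academic institution",
    "org": "originally non-profits; now open to anyone who pays",
    "com": "commercial company or private individual",
}

def domain_extension(url: str) -> str:
    """Return the final label of a URL's host, e.g. 'edu' for owl.purdue.edu."""
    host = urlparse(url).hostname or ""
    return host.rsplit(".", 1)[-1] if "." in host else host

print(domain_extension("https://owl.purdue.edu/owl/"))        # edu
print(EXTENSION_HINTS.get(domain_extension("https://www.usa.gov/"), "unknown"))
```

Note that this looks only at the last label of the hostname, so country-code domains such as .ac.uk would need extra handling.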

How current is the information?

As a general rule, recent sources are preferable to older ones. This is especially true in the sciences, as new data make it necessary to revise or even reject theories that once were widely accepted. Historical accounts may also become fuller and more nuanced as time passes and people gain perspective and access to a wider range of sources. On the other hand, the latest “discoveries” aren’t necessarily true; if an old claim has never been disproved, its age may be strong evidence of its truth.

What evidence does the source present for a claim?

Of course, credible sources do more than note the lack of contradictory evidence for a claim—they actively provide evidence that the claim is true. Whatever form this evidence takes—quantitative data, personal observations, quotations from experts—it should be possible to evaluate its quality. Does the source explain how data were gathered, for example, and do these methods follow best practices? Are the personal observations, or the experts’ expertise, relevant to the claim? If not, why not? Was the author simply sloppy, or might their biases have led them to misuse or misinterpret evidence? You may have to do additional research to answer these questions—which brings us to one more:

Do other credible sources support the claim?

Information should not be examined in isolation. Especially if you are not yourself an expert, the best strategy for evaluating a given source, claim, or piece of evidence is to conduct research from a variety of credible sources. The more you learn about a topic, the better prepared you are to determine the general consensus regarding that topic. You can then consider any particular statement in the light of that consensus. If the statement is inconsistent with or contradicted by the consensus, but its author is otherwise credible, that may be an occasion for further research. Perhaps the author of an eccentric claim has discovered something new that forces you to reevaluate your other sources. Or perhaps you can dismiss the claim—and even, by extension, the author.

* * * * *

Ultimately, an effective process for evaluating sources must be adaptable. Different sources may require you to ask different questions and to follow up in different ways. You can do everything “right” and never be positive that a claim is true; contradictory information may surface years later. Often the most you can do, as a consumer of information, is consume from a variety of credible sources and trust the overlapping material to reveal the truth.


Stand Out with These 5 Lesson Improvement Tips

It’s a necessary evil grounded in scientific research. Tedious, time-consuming, and difficult are among the words most commonly used to describe the process. Yet whether you type away with excitement or loathe it entirely, lesson planning remains a gold standard in today’s teaching profession (and for good reason).

Top Summer PD Opportunities

There is a heavy focus today on the importance of students learning how to learn and how to lead. Multiple-choice questions are becoming a thing of the past, as we educators seek to challenge students to apply what they’re learning instead of simply memorizing and regurgitating it.