
Types of Justice in the Classroom

Justice can take many different forms. In this post, we will look at three forms of justice: procedural, substantive, and negotiated. In particular, we will look at how each of these forms of justice works within the classroom.

Procedural Justice

Procedural justice means that the disciplinary power of the teacher is used only within the constraints of the policies and rules of the school. For example, most schools do not allow corporal punishment. This means that a teacher who decides to spank a student has violated what is considered an acceptable disciplinary process within that school.


Procedural justice also has to do with maintaining fairness. In other words, rules cannot be enforced randomly based on a teacher’s mood. When teachers are not consistent in applying and enforcing rules, it gives students the appearance of unfairness and injustice. When this happens, it can trigger even more undesirable behavior from students.

However, everyone has moments of inconsistency, including teachers. Therefore, when a teacher makes a mistake in procedural justice, it is wise to acknowledge the mistake and make efforts to correct the misstep. Doing this helps students maintain faith in a system that tries to correct its mistakes.

Substantive Justice

Substantive justice concerns the unequal impact that enforcing rules has on different groups. A common example of substantive justice in the classroom is the disproportionate amount of trouble males and minorities get into.

Race and gender are both highly sensitive topics. Therefore, teachers must be aware of these two demographic traits among their students. The perception of differences in justice tied to demographic traits could lead to serious accusations and headaches.

Negotiated Justice

Negotiated justice is the process of how justice is discovered and carried out. A practical example would be a court trial. During the trial, the truth is sought so that justice can be delivered. In the classroom, there are many different ways in which teachers uncover what to do when it is time to administer justice.

For example, in some classes, a teacher will have both parties sit down and discuss what happened. In other classes, the students may be sent to the office to work out their disagreement. If the teacher witnessed what happened, there may be no questioning at all.

The ultimate point here is that teachers need to be aware of how they determine guilt and innocence in their classrooms. At times, emotions will overwhelm teachers, and they may make just or unjust decisions without knowing how they reached them. Naturally, we want to avoid unjust decisions, but no matter what decision is made, it is important to be aware of how it was developed.


Teachers must be careful with how they deal with justice in their classrooms. There is always a danger of being accused of oppression when you have power and authority over others. Awareness is at least one way that this problem can be avoided.


Views of Punishment

Punishment is a part of juvenile justice. However, as with most ideas and concepts, there is disagreement over the role and function of punishment. In this post, we will look at common positions in relation to punishment.


Reductivism

The reductivist position views punishment as a means to prevent future crimes. This approach is based on the utilitarian position of causing the most happiness for the most people. The belief is that preventing future crimes will bring more harmony and happiness to people than looking only at what has already happened.


There are several strategies used to support a reductivist approach, such as deterrence. Deterrence is the use of punishment to prevent crime by instilling fear. An example of deterrence would be capital punishment: the thought is that hangings or public executions will motivate others to be good. Other forms of deterrence used today include boot camps, which are meant to whip delinquent youths into shape, and, in some countries, corporal punishment such as caning to maintain order.

Another manifestation of reductivism is reform-rehabilitation. Reform traditionally meant hard labor, such as working on a chain gang, along with religious instruction. Rehabilitation involves treatment for some vice that may have led to incarceration, such as substance abuse treatment or sex-offender treatment. The assumption is that there is something wrong with the prisoner that can be fixed through treatment. Again, the motivation behind reform and rehabilitation is to change the person for the benefit of society.

A final form of reductivism is incapacitation. Incapacitation is simply a strategy of keeping offenders locked up to protect the public. One way this was done was through the three strikes law used in parts of the United States. Once a person committed a third felony the sentencing could be 25 years to life.


Retributivism

The retributivist position looks to punish people for crimes already committed, with no regard for the future. In other words, retributivists focus on the past while reductivists focus on the future. Punishment should restore equilibrium and focus on what is right to do rather than what is good to do (the utilitarian position). The reason for this distinction is that right and wrong are more immovable than what makes people happy.

The main strategy for retribution is just deserts. Just deserts means punishing people for the crimes they have committed and doing no more. As such, people who hold a retributivist perspective do not support three-strikes laws, deterrence, or similar methods.


The point is not to state that one of these positions is superior to the other. Rather, the goal is to explain these two positions so the reader is informed about them. There are times and circumstances in which one position would serve better than the other.


Terms Related to Data Storage

There are several different terms used when referring to data within an organization that can become confusing for people who are not experts in this field. In this post, we will look at various terms that are often misused in the field of data management.


Database

Databases are for structured data, which is data organized into rows and columns. Among the many benefits of using a database over an Excel spreadsheet is that databases can hold almost limitless amounts of data. In addition, databases allow multiple users to query and input data at the same time, which is not possible with a spreadsheet.
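
As a minimal sketch, the rows-and-columns model is easy to see with SQL. The table and values below are hypothetical, and Python's built-in sqlite3 module stands in for a full database server:

```python
import sqlite3

# An in-memory database with one structured table (rows and columns)
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE students (student_id INTEGER PRIMARY KEY, name TEXT, class_level TEXT)"
)
conn.executemany(
    "INSERT INTO students VALUES (?, ?, ?)",
    [(1001, "Alice", "Senior"), (1002, "Bob", "Junior"), (1003, "Cara", "Junior")],
)

# Unlike a spreadsheet, the data is queried with SQL rather than scanned by eye
juniors = conn.execute(
    "SELECT name FROM students WHERE class_level = 'Junior' ORDER BY name"
).fetchall()
print(juniors)  # [('Bob',), ('Cara',)]
```

A real multi-user deployment would use a server-based database rather than an in-memory file, but the structured model is the same.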

Data Warehouse

A data warehouse is a computer system designed to store and analyze large amounts of data for an organization. The data for a data warehouse can come from various areas within the organization. Since the data comes from many different places it also helps to integrate data for the purpose of analysis which is valuable for decision-making and insights.


Data warehouses take pressure off databases by providing another location for data. However, because of their size, often over 100 GB, data warehouses are hard to change once they are up and running. Therefore, great care is needed when developing and using this tool.

Data Marts

Data marts are similar to data warehouses with the main difference being the scope. Like data warehouses, data marts are also databases. However, data marts are focused on one subject or department whereas data warehouses gather data from all over an organization. For example, a school might have a data warehouse for all student data while it has a data mart that only holds student classes and grades.

Since they have a focus on a given subject, data marts are generally smaller than data warehouses at less than 100 GB. The rationale of a data mart is that analytic teams can focus when trying to develop insights rather than searching through a larger data warehouse.

Data Lake

Data lakes are also similar to data warehouses. Just like a data warehouse data lakes contain data from all over the organization from many sources. Data lakes are also generally larger than 100 GB. One of the main differences is that data lakes contain structured and unstructured data. Unstructured data is data that does not fit into rows and columns. Examples can include video data, social media, and images.

Another purpose for a data lake is to have a place to keep data that may not have a specific purpose yet. Another way to think of a data lake is as a historical repository of data. Due to their multipurpose nature, data lakes are often less complex than data warehouses.


All of the various data products discussed here work together to give an organization access to its data. It is important to understand these different terms because it is common for people to use them interchangeably to the confusion of everyone involved. With consistent terminology, everyone can be on the same page when it comes to delivering value through using data.


Types of Data Quality Rules

Data quality rules are for protecting data from errors. In this post, we will learn about different data quality rules. In addition, we will look at tools used in connection with data quality rules.


Detective Rules

Detective rules monitor data after it has already moved through a pipeline and is being used by the organization. Detective rules are generally used when the issues being detected are not causing a major problem, when the issue cannot be solved quickly, and when a limited number of records are affected.

Of course, all of the criteria listed above are relative. In other words, it is up to the organization to determine what thresholds are needed for a data quality rule to be considered a detective rule.


An example of a detective data quality rule may involve a student information table that is missing a student’s uniform size. Such information is useful but probably not important enough to stop the data from moving on to others for use.


Preventive Rules

Preventive data quality rules stop data in the pipeline when issues are found. Preventive rules are used when the data is too important to allow errors, when the problem is easy to fix, and/or when the issue affects a large number of records. Again, all of these criteria are relative to the organization.

An example of a violation of a data quality prevention rule would be a student records table missing student ID numbers. Generally, such information is needed to identify students and make joins between tables. Therefore, such a problem would need to be fixed immediately.
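
As a rough sketch, the difference between the two rule types can be expressed in code. The records and function names below are hypothetical:

```python
# Hypothetical student records; None marks a missing value
records = [
    {"student_id": 1001, "uniform_size": "M"},
    {"student_id": None, "uniform_size": None},
]

def detective_rule(records):
    """Flag missing uniform sizes for later review, but let the data keep moving."""
    return [r for r in records if r["uniform_size"] is None]

def preventive_rule(records):
    """Stop the pipeline when a critical field such as student_id is missing."""
    if any(r["student_id"] is None for r in records):
        raise ValueError("Missing student IDs: data blocked in the pipeline")
    return records

flagged = detective_rule(records)  # detective: note the issue and move on
print(len(flagged))  # 1

try:
    preventive_rule(records)  # preventive: halts the data in the pipeline
except ValueError as err:
    print(err)
```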

Thresholds & Anomaly Detection

There are several tools for implementing detection and prevention data quality rules. Among the choices are the setting of thresholds and the use of anomaly detection.

Thresholds are actions that are triggered after a certain number of errors occur. It is up to the organization to determine how to set its thresholds. Common levels include no action, warning, alert, and prevention. Each level has a minimum number of errors that must occur before the information is passed on to the user or IT.

To make things more complicated, threshold levels can be tied to detective and preventive rules. For example, if a dataset has 5% missing data, it might only be flagged at the warning threshold. However, if the missing data jumps to 10%, it might now violate a preventive rule because the error rate has reached the prevention level.
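
A sketch of this idea, using the 5% and 10% cutoffs from the example; the alert cutoff is an added assumption an organization would set itself:

```python
def threshold_level(missing_rate):
    """Map a missing-data rate to an action level.

    The 5% warning and 10% prevention cutoffs come from the example above;
    the 8% alert cutoff is a made-up value an organization would tune.
    """
    if missing_rate >= 0.10:
        return "prevention"  # violates a preventive rule: block the data
    if missing_rate >= 0.08:
        return "alert"
    if missing_rate >= 0.05:
        return "warning"     # detective territory: notify but let data through
    return "no action"

print(threshold_level(0.02))  # no action
print(threshold_level(0.05))  # warning
print(threshold_level(0.10))  # prevention
```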

Anomaly detection can be used to find outliers, and unusual records can be flagged for review. For example, suppose a university has an active student who was born in 1920. Such a birthdate is highly unusual, and the rule should flag it as an outlier. After review, IT can decide whether the record needs to be edited. Anomaly detection can be used to detect or prevent data errors, and thresholds can be set for it as well.
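
The birthdate example can be sketched as a simple rule; the records and the 1940 cutoff year are hypothetical:

```python
from datetime import date

# Hypothetical active-student records
students = [
    {"student_id": 1001, "birthdate": date(2004, 5, 17)},
    {"student_id": 1002, "birthdate": date(1920, 1, 1)},  # highly unusual
]

def flag_birthdate_outliers(students, min_year=1940):
    """Return records whose birth year falls outside a plausible range.

    The cutoff year is an assumption; production systems often derive
    it statistically, e.g., from the distribution of birth years.
    """
    return [s for s in students if s["birthdate"].year < min_year]

outliers = flag_birthdate_outliers(students)
print([s["student_id"] for s in outliers])  # [1002]
```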


Data quality rules can be developed to monitor the state of data within a system. Once the rules are developed, it is important to determine whether they are detective or preventive. The main reason for this is that the type of rule affects the urgency with which a problem needs to be addressed.


Data Profile

One aspect of the data governance experience is data profiling. In this post we will look at what a data profile is, an example of a simple data profile, and the development of rules that are related to the data profile.


Data profiling is the process of running descriptive statistics on a dataset to develop insights about the data and field dependencies. Some questions that are commonly asked when performing a data profile include:

  • How many observations are in the data set?
  •  What are the min and max values of a column(s)?
  •  How many observations have a particular column populated with a value (missing vs non-missing data)?
  •  When one column is populated what other columns are populated?

Data profiling helps you to confirm what you know and do not know about your data. This knowledge will help you to determine issues with your data quality and to develop rules to assess data quality.

Student Records Table


The first column from the left is the student ID. Looking at this column, we can see that there are five records with data and that the column is numeric with 4 characters. The minimum value is 1001 and the maximum value is 1005.

The next two columns are first name and last name. Both columns are string text; first names range from 5 to 7 characters and last names are 5 characters. For both columns, 80% of the records are populated with a value, and 60% of the records have both a first name and a last name.


The fourth column is the birthdate. This column is populated 80% of the time, and all rows follow a MM/DD/YYYY format. The minimum value is 04/04/2000 and the maximum value is 01/01/2005. 40% of the rows have a first name, last name, and birthdate.

Lastly, 100% of the class-level column is populated with values. 20% of the values are senior, 40% are junior, 20% are sophomore, and 20% are freshman.
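
The profiled table itself is not shown here, but a hypothetical set of records consistent with the statistics above makes the profiling step concrete:

```python
# Hypothetical records consistent with the profile described above;
# None marks a missing value
records = [
    {"StudentID": 1001, "First": "James",   "Last": "Smith", "Birthdate": "04/04/2000", "Level": "Senior"},
    {"StudentID": 1002, "First": "Rachel",  "Last": "Jones", "Birthdate": "01/01/2005", "Level": "Junior"},
    {"StudentID": 1003, "First": "Timothy", "Last": None,    "Birthdate": "06/15/2003", "Level": "Junior"},
    {"StudentID": 1004, "First": None,      "Last": "Brown", "Birthdate": "09/09/2002", "Level": "Sophomore"},
    {"StudentID": 1005, "First": "Maria",   "Last": "Davis", "Birthdate": None,         "Level": "Freshman"},
]

def pct_populated(records, col):
    """Share of rows with a non-missing value in the given column."""
    return sum(r[col] is not None for r in records) / len(records)

print(len(records))                          # 5 observations
print(min(r["StudentID"] for r in records))  # 1001
print(max(r["StudentID"] for r in records))  # 1005
print(pct_populated(records, "First"))       # 0.8
print(pct_populated(records, "Birthdate"))   # 0.8
```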

Developing Data Quality Rules

From the insights derived from the data profile, we can now develop some rules to ensure quality. With any analysis or insight the actual rules will vary from place to place based on needs and context but below are some examples for demonstration purposes.

  • All StudentID values must be 4 numeric characters
  • All StudentID values must be populated
  • All StudentFirstName values must be 1-10 characters in length
  • All StudentLastName values must be 1-10 characters in length
  • All StudentBirthdate values must be in MM/DD/YYYY format
  • All StudentClassLevel values must be Freshman, Sophomore, Junior, or Senior
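
Rules like these can be implemented as a short validation function; the sketch below is one possible approach, and the sample records are hypothetical:

```python
import re

def validate(record):
    """Return a list of rule violations for one record.

    The rules mirror the examples above; real rules would be
    tailored to the organization.
    """
    errors = []
    if not re.fullmatch(r"\d{4}", str(record.get("StudentID") or "")):
        errors.append("StudentID must be 4 numeric characters")
    for col in ("StudentFirstName", "StudentLastName"):
        value = record.get(col) or ""
        if not 1 <= len(value) <= 10:
            errors.append(f"{col} must be 1-10 characters")
    if not re.fullmatch(r"\d{2}/\d{2}/\d{4}", record.get("StudentBirthdate") or ""):
        errors.append("StudentBirthdate must be in MM/DD/YYYY format")
    if record.get("StudentClassLevel") not in {"Freshman", "Sophomore", "Junior", "Senior"}:
        errors.append("StudentClassLevel must be a valid class level")
    return errors

good = {"StudentID": 1001, "StudentFirstName": "James", "StudentLastName": "Smith",
        "StudentBirthdate": "04/04/2000", "StudentClassLevel": "Senior"}
bad = {"StudentID": 13, "StudentFirstName": "", "StudentLastName": "Smith",
       "StudentBirthdate": "2000-04-04", "StudentClassLevel": "Sr"}

print(validate(good))      # []
print(len(validate(bad)))  # 4
```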


A data profile can be much more in-depth than the example presented here. However, if you have hundreds of tables and dozens of databases this can be quite a labor-intensive experience. There is software available to help with this but a discussion of that will have to wait for the future.


Data Quality

Bad data leads to bad decisions. However, the question is how you can know whether your data is bad. One answer to this question is the use of data quality metrics. In this post, we will look at a definition of data quality as well as metrics of data quality.


Data quality is a measure of the degree that data is appropriate for its intended purpose. In other words, it is the context in which the data is used that determines if it is of high quality. For example, knowing email addresses may be appropriate in one instance but inappropriate in another instance.


When data is determined to be of high quality it helps to encourage trust in the data. Developing this trust is critical for decision-makers to have confidence in the actions they choose to take based on the data that they have. Therefore data quality is of critical importance for an organization and below are several measures of data quality.

Measuring Data Quality

Completeness is a measure of the degree to which expected columns (variables) and rows (observations) are present. There are times when data can be incomplete due to missing values and/or missing variables. Data can also be partially complete, meaning that data is present in some columns but not others. There are various tools for finding this type of missing data in whatever language you are using.

Validity is a measure of how appropriate the data is compared to what the data is supposed to represent. For example, suppose a column in a dataset measures the class level of high school students using Freshman, Sophomore, Junior, and Senior. Data would be invalid if it used numerical values for the grade levels, such as 9, 10, 11, and 12. This is invalid only because of the context and the assumptions brought to the data quality test.

Uniqueness is a measure of duplicate values. Normally, duplicate values happen along rows in structured data which indicates that the same observation appears twice or more. However, it is possible to have duplicate columns or variables in a dataset. Having duplicate variables can cause confusion and erroneous conclusions in statistical models such as regression.

Consistency is a measure of whether data is the same across all instances. For example, there are times when a dataset is refreshed overnight or on some other schedule. The expectation is that the data should be mostly the same except for the new values, and a consistency check assesses this. There are also times when thresholds are put in place so that the data can differ a little, based on the parameters that are set.

Timeliness is the availability of the data. For example, if data is supposed to be ready by midnight any data that comes after this time fails the timeliness criteria. Data has to be ready when it is supposed to be. This is critical for real-time applications in which people or applications are waiting for data.

Accuracy is the correctness of the data. The main challenge here is that measuring accuracy assumes a known ground truth to compare against. If a ground truth is available, the data is compared to it to determine accuracy.
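
Several of these metrics can be checked with short snippets. The records below are hypothetical, and the checks are minimal sketches:

```python
# Hypothetical records; None marks a missing value
data = [
    {"id": 1, "level": "Freshman"},
    {"id": 2, "level": "9"},       # invalid: numeric grade instead of a label
    {"id": 2, "level": "Junior"},  # duplicate id: fails uniqueness
    {"id": 4, "level": None},      # missing value: hurts completeness
]
valid_levels = {"Freshman", "Sophomore", "Junior", "Senior"}

# Completeness: share of rows with every expected field populated
completeness = sum(all(v is not None for v in row.values()) for row in data) / len(data)

# Validity: share of populated levels matching the expected labels
populated = [row["level"] for row in data if row["level"] is not None]
validity = sum(v in valid_levels for v in populated) / len(populated)

# Uniqueness: are all ids distinct?
ids = [row["id"] for row in data]
is_unique = len(ids) == len(set(ids))

print(completeness)        # 0.75
print(round(validity, 2))  # 0.67
print(is_unique)           # False
```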


The metrics shared here are for helping the analyst to determine the quality of their data. For each of these metrics, there are practical ways to assess them using a variety of tools. With this knowledge, you can be sure of the quality of your data.

Reciprocal Teaching VIDEO

One goal of many teachers is to help their students become independent, self-directed learners. One tool for achieving autonomous learners is reciprocal teaching. The video below explains the steps involved in utilizing reciprocal teaching in the classroom.


Data Governance Solutions

Data governance is good at indicating various problems an organization may have with data. However, finding problems doesn’t help as much as finding solutions does. This post will look at several different data governance solutions that deal with different problems.

Business Glossary

The business glossary contains standard descriptions and definitions. It also can contain business terms or discipline-specific terminology. One of the main benefits of developing a business glossary is creating a common vocabulary within the organization.

Many if not all businesses and fields of study have several different terms that mean the same thing. In addition, people can be careless with terminology, to the confusion of outsiders. Lastly, sometimes a local organization will have its own unique terminology. Whatever the case, the business glossary helps everyone within an organization communicate with one another.


An example of a term in a business glossary might be how a school defines a student ID number. The glossary explains what the student ID number is and provides uses of the ID number within the school.

Data Dictionary

The data dictionary provides technical information. Some of the information in the data dictionary can include the location of data, relationships between tables, values, and usage of data. One benefit of the data dictionary is that it promotes consistency and transparency concerning data.

Returning to our student ID number example, a data dictionary would share where the student ID number is stored and the characteristics of this column such as the ID number being 7 digits. For a categorical variable, the data dictionary may explain what values are contained within the variable such as male and female for gender.

Data Catalog

A data catalog is a tool for metadata management. It provides an organized inventory of data within the organization. Benefits of a data catalog include improving efficiency and transparency, quick locating of data, collaboration, and data sharing.

An example of a data catalog would be a document that contains the metadata about several different data warehouses or sources within an organization. If a data analyst is trying to figure out where data on student ID numbers is stored, they may start with the data catalog to locate it. The data dictionary will then explain the characteristics of the student ID column. Sometimes the data dictionary and catalog can be one document if tracking data in an organization is not too complicated. The point is that the distinction between these solutions is not rigid and is really up to the organization.

Automated Data Lineage

Data lineage describes how data moves within an organization, from production to transformation and finally to loading. Tracking this process is complicated and time-consuming, and many organizations have turned to software to complete it.

The primary benefit of tracking data lineage is increasing the trust and accuracy of the data. If there are any problems in the pipeline, data lineage can help to determine where the errors are creeping into the pipeline.

Data Protection, Privacy, & Quality

Data protection is about securing the data so that it is not tampered with in an unauthorized manner. An example of data protection would be implementing access capabilities such as user roles and passwords.

Data privacy is related to protection and involves making sure that information is restricted to authorized personnel. Thus, this also requires the use of logins and passwords. In addition, classifying the privacy level of data can also help in protecting it. For example, salaries are generally highly confidential while employee work phone numbers are probably not.

Data quality involves checking the health of the accuracy and consistency of the data. Tools for completing this task include KPIs and metrics to measure data quality, policies and standards that define what the organization considers good data quality, and reports that share the current quality of data.


The purpose of data governance is to support an organization in maintaining data that is an asset to the organization. In order for data to be an asset it must be maintained so that the insights and decisions that are made from the data are as accurate and clear as possible. The tools described in this post provide some of the ways in which data can be protected within an organization.


Data Governance Strategy

A strategy is a plan of action. Within data governance, it makes sense to develop a strategy to ensure data governance takes place. In this post, we will look at the common components of a data governance strategy, listed below.

  • Approach
  •  Vision statement
  •  Mission statement
  •  Value proposition
  •  Guiding principles
  •  Roles & Responsibilities

There is probably no particular order in which these components are completed. However, they tend to follow an inverted pyramid in terms of the scope of what they deal with. In other words, the approach is perhaps the broadest component and affects everything below it followed by the vision statement, etc. Where to begin probably depends on how your mind works. A detail-oriented person may start at the bottom while a big-picture thinker would start at the top.

Defined Approach

The approach defines how the organization will go about data governance. There are two extremes for this and they are defensive and offensive. A defensive approach is focused on risk mitigation while an offensive approach is focused more on achieving organizational goals.


Neither approach is superior to the other, and the situation an organization is in will shape which is appropriate. For example, an organization that is struggling with data breaches may choose a more defensive approach, while an organization that is thriving may take a more offensive approach.

Vision Statement

A vision statement is a brief snapshot of where the organization wants to be. Another way to see this is that a vision statement is the purpose of the organization. The vision statement needs to be inspiring and easily understood. It also helps to align the policies and standards that are developed.

An example of a vision statement for data governance is found below.

Transforming how data is leveraged to make informed decisions to support youth served by this organization

The vision is to transform data for decision-making. This is an ongoing process that will continue indefinitely.

Mission Statement

The mission statement explains how an organization will strive toward its vision. Like a vision statement, the mission statement provides guidance in developing policies and standards. The mission statement should be a call to action and include some of the goals the organization has for its data. Below is an example:

Enabling stakeholders to make data-driven decisions by providing accurate, timely data and insights

In the example above, it is clear that accuracy, timeliness, and insights are the goals for achieving the vision statement. In addition, the audience is identified which is the stakeholders within the organization.

Value Proposition

The value proposition provides a justification for, or the significance of, adopting a data governance strategy. Another way to look at this is as an emphasis on persuasion. Some of the ideas included in the value proposition are the benefits of implementation. Often the value proposition is written in the form of cause-and-effect statements. Below is an example:

By implementing this data governance program we will see the following benefits: 

Improved data quality for actionable insights, increased trust in data for making decisions, and clarity of roles and responsibilities of analysts

In the example above, three clear benefits are shared, succinctly giving people the potential outcomes of adopting this strategy. Naturally, it would be beneficial to develop ways to measure these ideas, which means that only benefits that can be measured should be part of the value proposition.

Guiding Principles

Guiding principles define how data should be used and managed. Common principles include transparency, accountability, integrity, and collaboration. These principles are just more concrete information for shaping policies and standards. Below is an example of a guiding principle.

All data will have people assigned to play critical roles in it

The guiding principle above is focused on accountability. It is important to define and explain how all data will have people assigned to perform various responsibilities concerning it.

Roles & Responsibilities

Roles and responsibilities are about explaining the function of the data governance team and the role each person will play. For example, a small organization might have people who adopt more than one role such as being data stewards and custodians while larger organizations might separate these roles.

In addition, it is also important to determine the operating model and whether it will be centralized or decentralized. Determining the operating model again depends on the context and preferences of the organization.

It is also critical to determine how compliance with the policies and standards will be measured. It is not enough to state them; eventually, there needs to be evidence of progress and of potential changes that need to be made to the strategy. For example, perhaps a data audit is done monthly or quarterly to assess data quality.


Having a data governance strategy is a crucial step in improving data governance within an organization. Once a plan is in place it is simply a matter of implementation to see if it works.


Data Governance Assessment

Before data governance can begin at an organization it is critical to assess where the organization is currently in terms of data governance. This necessitates the need for a data governance assessment. The assessment helps an organization to figure out where to begin by identifying challenges and prioritizing what needs to be addressed. In particular, it is common for there to be five steps in this process as shown below.

  1. Identify data sources and stakeholders
  2.  Interview stakeholders
  3.  Determine current capabilities
  4.  Document the current state and target state
  5.  Analyze gaps and prioritize

We will look at each of these steps below.

Identify Data Sources and Stakeholders

Step one involves determining what data is used within the organization and the users or stakeholders of this data. Essentially, you are trying to determine…

  • What data is out there?
  •  Who uses it?
  •  Who produces it?
  •  Who protects it?
  •  Who is responsible for it?

Answering these questions also provides insights into what roles in relation to data governance are already being fulfilled at least implicitly and which roles need to be added to the organization. At most organizations at least some of these questions have answers and there are people responsible for many roles. The purpose here is not only to get this information but also to make people aware of the roles they are fulfilling from a data governance perspective.


Interview Stakeholders

Step two involves interviewing stakeholders. Once it is clear who is associated with data in the organization it is time to reach out to these people. You want to develop questions to ask stakeholders in order to inform you about what issues to address in relation to data governance.

An easy way to do this is to develop questions that address the pillars of data governance. The pillars are…

  • Ownership & accountability
  •  Data quality
  •  Data protection and privacy
  •  Data management
  •  Data use

Below are some sample questions based on the pillars above.

  • How do you know your data is of high quality?
  • What needs to be done to improve data quality?
  • How is data protected from misuse and loss?
  • How is metadata handled?
  • What concerns do you have related to data?
  • What policies are there now related to data?
  • What roles are there in relation to data?
  • How is data used here?

It may be necessary to address all or some of these pillars when conducting the assessment. The benefit of these pillars is that they provide a starting point from which you can shape your own interview questions. In terms of the interview, it is up to each organization to determine what is best for data collection. Maybe a survey works, or perhaps semi-structured interviews or focus groups. The actual research part of this process is beyond the scope of this post.

Determine Current Capabilities

Step three involves determining the current capabilities of the organization in terms of data governance. Often this can be done by looking at the stakeholder interviews and comparing what they said to a rating scale. For example, the DCAM rating scale has six levels of data governance competence as shown below.

  1. Non-initiated-No governance happening
  2. Conceptual-Aware of data governance and planning
  3. Developmental-Engaged in developing a plan
  4. Defined-Plan approved
  5. Achieved-Plan implemented and enforced
  6. Enhanced-Plan is part of the culture and updated regularly

Determining the current capabilities is a subjective process. However, it needs to be done in order to determine the next steps in bringing data governance along in an organization.
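A minimal sketch of how a rating scale like this might be used in practice; the level descriptions follow the DCAM-style scale above, while the function name is my own:

```python
# The six maturity levels from the DCAM-style scale above, keyed by level.
MATURITY_LEVELS = {
    1: "Non-initiated: no governance happening",
    2: "Conceptual: aware of data governance and planning",
    3: "Developmental: engaged in developing a plan",
    4: "Defined: plan approved",
    5: "Achieved: plan implemented and enforced",
    6: "Enhanced: plan is part of the culture and updated regularly",
}

def describe_maturity(level):
    """Translate a rated level (1-6) into its description."""
    if level not in MATURITY_LEVELS:
        raise ValueError("maturity level must be between 1 and 6")
    return MATURITY_LEVELS[level]
```

Because the rating itself is subjective, a lookup like this mainly helps keep the team's language consistent when comparing interview results against the scale.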

Document Current State and Target State

Step four involves documenting the current state and determining what the target state should be. Again, this will be based on what was learned in the stakeholder interviews. You will report what the stakeholders said in the interviews, organized by the pillars of data governance. It is not necessary to use the pillars, but they do provide a convenient way to organize the data without having to develop your own way of classifying the results.

Once the current state is defined, it is time to determine what the organization should be striving for in the future; this is called the target state. The target state is the direction the organization is heading within a given timeframe. It is up to the data governance team to determine this, and how it is done will vary. The main point is to make sure not to address too many issues at once; save some for the next cycle.

Analyze and Prioritize

The final step is to analyze and prioritize. This step involves performing a gap analysis to determine solutions that will solve the issues found in the previous step. In addition, it is also important to prioritize which gaps to address first.
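A gap analysis of this kind can be sketched in a few lines of code, assuming maturity ratings like those on the scale discussed earlier. The pillar names and numbers here are purely illustrative:

```python
def gap_analysis(current, target):
    """Rank pillars by the distance between current and target maturity."""
    gaps = {pillar: target[pillar] - current[pillar] for pillar in target}
    # Largest gap first, so the team knows what to prioritize.
    return sorted(gaps.items(), key=lambda item: item[1], reverse=True)

current = {"Data quality": 2, "Data protection": 3, "Data use": 1}
target = {"Data quality": 4, "Data protection": 4, "Data use": 4}
priorities = gap_analysis(current, target)
# priorities -> [("Data use", 3), ("Data quality", 2), ("Data protection", 1)]
```

The output is simply an ordered list of where the organization is furthest from its target, which is a natural starting point for prioritization.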

Another part of this step is sharing recommendations and soliciting feedback. Provide insights into which direction the organization can go to improve its data governance, and allow stakeholders to give feedback on whether they agree with the report. Once all this is done, the report is completed and documented until the next time this process needs to take place.


The steps presented here are not prescriptive. They are shared as a starting point for an organization’s journey in improving data governance. With experience, each organization will find its own way to support its stakeholders in the management of data.


Total Data Quality

Total data quality, as its name implies, is a framework for improving the state of data that is used for research and reporting purposes. The dimensions used to assess the quality of data are measurement and representation.


Measurement is focused on the values gathered on the variable(s) of interest. When assessing measurement, researchers are concerned with:

  • Construct-The construct is the definition of the variable of interest. For example, income can be defined as a person’s gross yearly salary in dollars. However, it could also be defined per month, or as the net amount after taxes, which shows how the same construct can be defined differently. Construct validity must also be determined to ensure the measure captures what it claims to measure.
  • Field-This is the place where data is collected and how it is collected. For example, our income variable could be collected from students or from working adults. Where the data comes from affects its quality relative to the research problem and questions: if the research questions focus on student income, then collecting income data from students supports quality. How the data is encoded also matters; all student incomes need to be in the same currency for comparisons to make sense.
  • Data Values-This refers to the tools and procedures for preparing the data for analysis to ensure high-quality values within the data. Challenges addressed here include missing data, data entry errors, duplications, the assumptions of various analytical approaches, and issues between variables such as high correlations.
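As an illustration of the data-values checks just described, here is a small sketch that flags missing values and exact duplicate records. The field names and helper function are hypothetical:

```python
def data_value_issues(rows, required_fields):
    """Report missing values and duplicate records in a list of dicts."""
    missing = [
        (i, field)
        for i, row in enumerate(rows)
        for field in required_fields
        if row.get(field) is None
    ]
    seen, duplicates = set(), []
    for i, row in enumerate(rows):
        key = tuple(sorted(row.items()))
        if key in seen:
            duplicates.append(i)  # an exact copy of an earlier record
        seen.add(key)
    return {"missing": missing, "duplicates": duplicates}

records = [
    {"student": "A", "income": 1200},
    {"student": "B", "income": None},  # missing value
    {"student": "A", "income": 1200},  # duplicate entry
]
issues = data_value_issues(records, ["student", "income"])
# issues -> {"missing": [(1, "income")], "duplicates": [2]}
```

In a real project a library such as pandas would do this at scale, but the underlying checks are the same.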


Representation looks at determining if the data collected comes from the population of interest. Several concerns need to be addressed when dealing with representation.

  • Target population-The target population is the pool of potential participants in the study. A limitation here is determining access to the target population. For example, studies involving children can be difficult because of ethical concerns over data collection with children, and these concerns limit access at times.
  • Data sources-Data sources are avenues for obtaining data. A source can be a location such as a school or a group of people such as students, among other definitions. Once access is established, it is necessary to determine specifically where the data will come from.
  • Missing data-Missing data is not just about incomplete values in a dataset. It is also about who was left out of the data collection process. For example, if the target population is women, then women should be represented in the data. Missing data can also concern who is represented in the data but should not be: if women are the target population, then there should not be any men in the dataset.
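A representation check along these lines can be sketched as follows; the dataset and the target-population test are invented for illustration:

```python
def representation_report(rows, in_target):
    """Split records into those inside and outside the target population."""
    matches = [row for row in rows if in_target(row)]
    outsiders = [row for row in rows if not in_target(row)]
    return matches, outsiders

# Target population: women. Any men present are records that should not be there.
data = [
    {"id": 1, "gender": "female"},
    {"id": 2, "gender": "male"},
    {"id": 3, "gender": "female"},
]
women, should_not_be_here = representation_report(
    data, lambda row: row["gender"] == "female"
)
```

The second list makes the "represented but should not be" problem visible, while an empty first list would signal that the target population was left out entirely.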

Measurement and representation meet at the data analysis stage of a research project. If measurement and representation are bad, it is already apparent that the data analysis will not yield useful insights. However, if the measurement and representation are perfect but the analysis is poor, then you are still left without useful insights.


Measurement and representation are key components of data quality. Researchers need to be aware of these ideas to ensure that they are providing useful results to whatever stakeholders are involved in a study.


Data Types

There are many different ways that data can be organized and classified. In this post, we will look at data as it is classified by purpose. Essentially, data can be gathered for non-research or research purposes. Data collected for non-research purposes is called gathered data and data collected for research purposes is called designed data.

Gathered Data

Gathered data is data that is obtained from sources that were not developed with the intention of conducting research specifically. Examples of gathered data would be data found in social media such as Twitter or YouTube and data that is scraped from a website. In each of those examples, data was collected but not necessarily for an empirical theory testing purpose.

Gathered data is also collected in many ways beyond websites. Other modes of data collection could be sensors such as traffic light cameras, transactions such as those at a store, and wearables such as those used during exercise.


Just because the data was not collected for research purposes does not mean that it cannot be used for research. Gathered data is frequently used to support research, as it can be analyzed and insights developed from it. The challenge is that gathered data may not directly address a researcher’s questions, which necessitates using this data as a proxy for a construct or rephrasing the research questions to align with what the gathered data can answer. Gathered data is also referred to as big data or organic data.

Designed data

Designed data is data that was developed and collected for a specific research purpose. Often this data is collected from people or establishments to answer scientifically designed research questions. A common way of collecting this form of data is the survey, and surveys can be conducted in person, online, and/or over the phone. These forms of data collection contrast with gathered data, which is collected passively and without human interaction. This leads to an important distinction: gathered data tends to be strictly quantitative because of its impersonal nature, while designed data can be quantitative and/or qualitative because a human element is possible in the collection process.

When a researcher wants designed data they will go through the process of conducting research which often includes developing a problem, purpose, research questions, and methodology. All of these steps are commonly involved in conducting research in general. The data that is collected for design purposes is then used to address the research questions of the study.

The purpose of this process is to ensure that the data collected will answer the specific questions the researcher has in mind. In other words, designed data is designed to answer specific research questions, while gathered data can hopefully answer some of them.


Understanding what data was collected for is beneficial for researchers because it helps them to be aware of the strengths and weaknesses the data may have based on its purpose. Neither gathered nor designed data is superior to the other. Rather, the difference is in what was the inspiration for collecting the data.


Data Governance Office

The data governance office, or team, is the group that leads in dealing with data within an organization. This team comprises several roles, such as

  • Chief Data Officer
  • Data Governance Lead
  • Data Governance Consultant
  • Data Quality Analyst

We will look at each of these below. It also needs to be mentioned that one person might be assigned several of these roles, which is particularly true in a smaller organization. Conversely, in a much larger organization, several people might fulfill a single role.

Chief Data Officer

The chief data officer is responsible for shaping the overall data strategy at an organization. The chief data officer also promotes a data-driven culture and pushes for change within the organization. A person in this position also needs to understand the data needs of the organization in order to further the vision of the institution or company.


The role of the chief data officer encompasses all of the other roles that will be discussed. The chief data officer is essentially the leader of the data team and provides help with governance consulting, quality, and analytics. However, the primary role of this position is to see the big picture for big data and to guide the organization in this regard, which implies that technical skills are beneficial but leadership and change promotion are more critical. In sum, this is a challenging position that requires a large amount of experience.

Data Governance Lead

The data governance lead’s primary responsibilities involve defining policies and data governance frameworks. While the chief data officer is more of an evangelist or promoter of data governance, the data governance lead is focused on the actual implementation of change and on guiding the organization through this process.

Essentially, the data governance lead is in charge of the day-to-day operation of the data governance team. While the chief data officer may be the dreamer the data governance lead is a steady hand behind the push for change.

Data Governance Consultant

The data governance consultant is the subject matter expert in data governance. Their role is to know all the details of data governance in the general field, and it is even better if they know how to make data governance happen in a particular discipline. For example, a data governance consultant might specialize in making data governance happen within the context of a university.

The data governance consultant supports the data governance lead with implementation. In addition, the consultant is a go-between for the larger organization and IT. Serving as a go-between implies that the consultant is able to communicate effectively with both parties: on a technical level with IT and in layman’s terms with the larger organization. The synergy between IT and the larger organization can be challenging because of communication issues due to vastly different backgrounds, and it is the consultant’s responsibility to bridge this gap.

Data Quality Analyst

The data quality analyst’s job is, as the name implies, to ensure quality data. One way of doing this is to develop rules for data entry. For example, a rule might state that marital status can only be single, married, divorced, or widowed. This rule blocks any other option that people may want to enter. When this rule is enforced, it is an example of high quality within this context.
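The marital-status rule above can be expressed as a simple validation check. This is a sketch, not any particular tool’s API:

```python
# The allowed options from the rule described above.
ALLOWED_MARITAL_STATUS = {"single", "married", "divorced", "widowed"}

def valid_marital_status(value):
    """True only when the entered value is one of the allowed options."""
    return value.strip().lower() in ALLOWED_MARITAL_STATUS

print(valid_marital_status("Married"))   # True: accepted
print(valid_marital_status("engaged"))   # False: rejected by the rule
```

In practice such rules usually live in the database schema or the data-entry form, but the logic is the same wherever it is enforced.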

A data quality analyst also performs troubleshooting, or root cause investigations. If something odd is going on in the data, such as duplicates, it is the data quality analyst’s job to determine what is causing the problem and to find a solution. Lastly, a data quality analyst is also responsible for statistical work. This can include the statistical work associated with a data analyst’s job and/or statistical work that monitors the use and quality of data within the organization.


The data governance team plays a critical role in supporting the organization with reliable and clean data that can be trusted to make actionable insights. Even though this is a tremendous challenge it is an important function in an organization.


Roles in Data Governance

Working with data is a team event. Different people are involved in different stages of the data process. The roles described below are roles commonly involved in data governance. The general order below is the common order in which these individuals will work with data. However, life is not always linear and different people may jump in at different times. In addition, one person might have more than one role when working with data in the governance process.

Data Owners

Data owners are responsible for the infrastructure such as the database in which data is stored for consumption and use. Data owners are also in charge of the allocation of resources related to the data. Data owners also play a critical role in developing standard operating procedures and compliance with these standards.

Data Producers

Once the database or whatever tool is used for the data the next role involved is the data producer. Data producers are responsible for creating data. The creation of data can happen through such processes as data entry or data collection. Data producers may also support quality control and general problem-solving of issues related to data. To make it simple the producer uses the system that the owner developed for the data.


Data Engineers

Data engineers are responsible for pipeline development, which is moving data from one place to another for various purposes. Data engineers deal with storage optimization and distribution, and they also support the automation of various tasks. Essentially, engineers move around the data that producers create.

Data Custodians

Data custodians are the keepers and protectors of data. They focus on using the storage created by the data owner and on delivering data, much like the data engineer. The difference is that the data custodian sends data to the people downstream in this process, such as stewards and analysts.

Data custodians also make sure to secure and back up the data. Lastly, data custodians are often responsible for network management.

Data Stewards

Data stewards work on defining and organizing data. These tasks might involve working with metadata in particular. Data stewards also serve as gatekeepers to the data, which involves keeping track of who is using and accessing it. Lastly, data stewards help consumers (analysts and scientists) find the data they may need to complete a project.

Data Analysts

Data analysts as the name implies analyze the data. Their job can involve statistical modeling of data to make a historical analysis of what happened in the past. Data analysts are also responsible for cleaning data for analysis. In addition, data analysts are primarily responsible for data visualization and storytelling development of data. Dashboards and reports are also frequently developed by the data analyst.

Data Scientists

The role of a data scientist is highly similar to that of a data analyst. The main difference is that data scientists use data to predict the future while data analysts use data to explain the past. In addition, data scientists serve as research designers to acquire additional data for the goals of a project. Lastly, data scientists do advanced statistical work, at times involving machine learning, artificial intelligence, and data mining.


The roles mentioned above all play a critical role in supporting data within an organization. When everybody plays their part well organizations can have much more confidence in the decisions they make based on the data that they have.


Data Governance Framework Types and Principles

When it is time to develop data governance policies the first thing to consider is how the team views data governance. In this post, we will look at various data governance frameworks and principles to keep in mind when employing a data governance framework.


The top-down framework involves a small group of data providers. These data providers serve as gatekeepers for data that is used in the institution. Whatever data is used is controlled centrally in this framework.


One obvious benefit of this approach is that with a small group of people in charge, decision-making should be fast and relatively efficient. In addition, if something does go wrong, it should be easy to trace the source of the problem. However, a top-down approach only works in situations with small amounts of data or few end users. When the amount of data becomes too large, the small team will struggle to support users, which means this approach is hard to scale. Lastly, people may resent having to abide by rules that are handed down from above.


The bottom-up approach to data governance is the mirror opposite of the top-down approach. Where top-down involves a handful of decision-makers, bottom-up focuses on a democratic style of data leadership. Bottom-up is scalable because everyone is involved in the process, while top-down does not scale well. Generally, under the bottom-up approach, controls and restrictions on data are put in place after the raw data is shared rather than before.

Like all approaches to data governance, there are concerns with the bottom-up approach. For example, it becomes harder to control the data when people are allowed to use raw data that has not been prepared for use. In addition, because of the democratic nature of the bottom-up approach, there is also an increased risk of security concerns because of the increased freedom people have.


The collaborative approach is a mix of top-down and bottom-up ideas on data governance. This approach is flexible and balanced while placing an emphasis on collaboration. The collaboration can be among stakeholders or between the gatekeepers and the users of data.

One main concern with this approach is that it can become messy and difficult to execute if principles and goals are not clearly defined. Therefore, it is important to spend a large amount of time planning when choosing this approach.


Regardless of which framework you pick when beginning data governance, there are several concepts you need to be familiar with to be successful. For example, integrity involves maintaining open lines of communication and the sharing of problems so that an atmosphere of trust is maintained or developed.

It is also important to determine ownership for the purpose of governance and decision-making. Determining ownership also helps to find gaps in accountability and responsibility for data.

Leaders in data governance must also be aware of change and risk management. Change management refers to the tools and processes for communicating new strategies and policies related to data governance; it helps ensure a smooth transition from one state of equilibrium to another. Risk management refers to the tools for auditing and for developing interventions for non-compliance.

A final concept to be aware of is strategic alignment. The goals and purpose of data governance must align with the goals of the organization that data governance is supporting. For example, a school will have a strict stance on protecting student privacy. Therefore, data governance needs to reflect this and support strict privacy policies.


Frameworks provide a foundation on which your team can shape their policies for data governance. Each framework has its strengths and weaknesses but the point is to be aware of the basic ways that you can at least begin the process of forming policies and strategies for governing data at an organization.


Data Governance Framework

In this post, we will look at defining a data governance framework. We will also look at the key components that make up a data governance framework.


A data governance framework is the how, or the plan, for governing the data within an organization. The term data governance identifies what needs to be governed or controlled, while the data governance framework is the actual plan for controlling the data.

Common Components

There are several common components of a data governance plan and they include the following.

  • Strategy
  • Policies
  • Processes
  • Coordination
  • Monitoring/communication
  • Data literacy/culture

Strategy involves determining how data can be used to solve problems. This may seem obvious, but only certain data can be used to solve certain problems. For example, customers’ addresses in California might not be appropriate for determining revenue generated in Texas. When data is looked at strategically, it helps ensure that it is viewed as an asset by those who use it.


Policies help to guide such things as decision-making and expectations concerning data. In addition, policies also help with determining responsibilities and tasks related to data management. One example of policy in action is the development of standards, which are rules for best practices that serve to meet a policy. A policy may be something like protecting privacy; a standard to meet this policy would be to ensure that data is encrypted and password protected.
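As a hedged sketch of how the password-protection standard might look in code, here is one way to avoid storing passwords in plain text using only Python’s standard library. Production systems should prefer a vetted password-hashing library such as bcrypt or argon2; this only illustrates the policy-versus-standard split:

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted hash so the password itself is never stored."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Re-derive the hash and compare it to the stored digest."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)
```

The policy ("protect privacy") says nothing about implementation; the standard is what turns it into a concrete, checkable rule like the one above.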

Processes and technology involve the steps for monitoring the quality of data. Other topics related to process can include dealing with metadata and data management. Proper processes mainly help with efficiency in the organization.

Coordination involves the processes of working together. Coordination can involve defining the roles and responsibilities for a complex process that requires collaboration with data. In other words, coordination is developed when multiple parties are involved with a complex task.

Progress monitoring involves the development of KPIs to make sure that the performance expectations are measured and adhered to. Progress monitoring can also involve issues related to privacy, quality, and compliance. An example of progress monitoring may be requiring everyone to change their password every 90 days. At the end of the 90 days, the system will automatically make the user create a new password.
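The 90-day password expectation can be sketched as a simple check; the function name and constant are illustrative:

```python
from datetime import date, timedelta

PASSWORD_MAX_AGE_DAYS = 90  # the expectation described above

def password_expired(last_changed, today=None):
    """True when a password is older than the 90-day policy allows."""
    today = today or date.today()
    return (today - last_changed) > timedelta(days=PASSWORD_MAX_AGE_DAYS)

print(password_expired(date(2024, 1, 1), today=date(2024, 5, 1)))  # True: 121 days old
print(password_expired(date(2024, 3, 1), today=date(2024, 5, 1)))  # False: 61 days old
```

A real system would run a check like this on login and force a reset when it returns True, which is exactly the automatic behavior described above.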

Lastly, data literacy and culture involve training the people within the organization who use or consume data, and developing their skills in analyzing and communicating with data. Naturally, this is an ongoing process, and how it works depends on who is involved.


A framework is a plan for achieving a particular goal or vision. As organizations work with data, they must be diligent in making sure that the data that is used is trustworthy and protected. A data governance framework is one way in which these goals can be attained.


Data Governance Benefits

Data governance is a critical part of many organizations today. In this post, we will look at some of the commonly found benefits of incorporating data governance into an organization.

Improved Data Quality

In theory, when data governance is implemented within an organization there should be a corresponding improvement in data quality. What is meant by improved data quality is better accuracy, consistency, and integrity. In addition, data quality can also include the completeness of the data and ensuring that the data is timely.


When data quality is high, it allows end users to have greater trust in the analysis and conclusions that can be made from the data. Improved trust can also lead to increased confidence when sharing and/or defending the decision-making process.

Risk Reduction

Data governance can also reduce risk. There are often laws that organizations have to follow concerning data, commonly including privacy laws. When data governance is implemented and carefully enforced, it helps with complying with these laws and thus lowers the risk of breaking them and/or facing legal consequences.

The typical organization does not want to deal with legal matters. As such, it benefits most, if not all, organizations to comply with laws through data governance. The process of abiding by laws also sets a good example for stakeholders and creates a culture of transparency.

Improved Decision-Making

Decisions are only as good as the information they are based upon. If data is bad, then there is a risk of making bad decisions. A common idiom in the data world states, “garbage in, garbage out.” Therefore, it is critical that the data accurately represents what it is supposed to represent.

As mentioned earlier, good data leads to good decisions and increased confidence. It also helps with improving understanding of the context from which the data came.

Improved Processes

Data governance can also improve various processes. For example, roles relating to data have to be clearly defined. In addition, various tasks that need to be completed must also be stipulated and clarified. Whenever steps like these are taken it can improve the speed at which things are done.

In addition, improving processes can also reduce errors. Since people know what their role is and what they need to do it is easier to spot and prevent mistakes as the data moves to the various parties that are using it.

Customer service

Data governance is also beneficial for customer service or dealing with stakeholders. When requests are made by customers or stakeholders, accurate data is critical for addressing their questions. In addition, there are situations in which customers or stakeholders can access the data themselves. For example, most customers can at least access their own personal information on a shopping website such as Amazon.

If data is not properly cared for, users cannot access it or have their questions answered. This is frustrating no matter what field or industry one works in. Therefore, data governance is important in enhancing the experience of customers and of the people who work in the institution.

Profit Up

A natural outcome of the various points mentioned above is increased profit or decreased expenses depending on the revenue model. When efficiency goes up and or customer satisfaction goes up there is often an increase in revenue.

What can be inferred from this is that data governance is not just a set of ideas to avoid headaches but a tool that can be employed to enhance profitability in many situations.


Data governance is beneficial in many more ways than mentioned here. For our purposes, data governance can allow an organization to focus on making cost-efficient, sound decisions by ensuring the quality and accuracy of the data involved in the process of making conclusions.


Influences and Approaches of Data Governance

Data governance has been around for a while. As a result, various trends and challenges have influenced this field. In this post, we will look at several laws that have had an impact on data governance, along with various concepts that have been developed to address common concerns.


Several laws have played a critical role in influencing data governance both in the USA and internationally. For example, the Sarbanes-Oxley (SOX) Act was enacted in 2002 in reaction to various accounting scandals at large corporations at the time. Among the requirements of this law are standards for financial and corporate reporting and the need for executives to verify, or attest, that the financial information is correct. Naturally, this requires data governance to make sure that the data is appropriate so that these requirements can be met.


There are also several laws related to privacy in particular. Focusing again on the USA, there is the Health Insurance Portability and Accountability Act (HIPAA), which requires institutions in the medical field to protect patient data. Leaders in data must develop data governance policies that protect medical information.

In the state of California, there is the California Consumer Privacy Act (CCPA), which gives California residents more control over how their personal data is handled by companies. The CCPA is focused much more on the collection and selling of personal data, as this has become a lucrative industry in the data world.

At the international level, there is the General Data Protection Regulation (GDPR). The GDPR is a privacy law that applies to anybody who lives in the EU. What this implies is that a company in another part of the world that has customers in the EU must abide by this law as well. As such, this is one example of a local law related to data governance that can have a global impact.

Various Concepts that Support Data Governance

Data governance was around much earlier than the laws described above. However, several different concepts and strategies were developed to address transparency and privacy as explained below.

Data classification and retention deal with the level of confidentiality of the data and the policies for data destruction. For example, social security numbers are a form of data that is highly confidential, while the types of shoes a store sells would probably not be considered private. In addition, some data is not meant to be kept forever. For example, consumers may request that their information, such as credit card numbers, be removed from a website. In such a situation, there must be a way for this data to be removed permanently from the system.
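A permanent-removal request like the credit card example might be sketched as follows; the record layout and function name are assumptions for illustration, and a real system would also have to purge backups and logs:

```python
def remove_sensitive_data(records, customer_id, fields):
    """Permanently strip the named fields from one customer's records."""
    for record in records:
        if record.get("customer_id") == customer_id:
            for field in fields:
                record.pop(field, None)  # delete in place; nothing is retained
    return records

accounts = [
    {"customer_id": 7, "name": "Ana", "credit_card": "4111111111111111"},
    {"customer_id": 8, "name": "Ben", "credit_card": "5555555555554444"},
]
remove_sensitive_data(accounts, 7, ["credit_card"])  # Ana's card number is gone
```

The retention policy decides *when* this runs; the code only shows *what* permanent removal means at the record level.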

Data management is focused on consistency and transparency. There must be a master copy of data to serve as a backup and for checking the accuracy of other copies. In addition, there must be some form of data reference management to identify and map datasets through some general identification such as zip code or state.

Lastly, metadata management deals with data that describes the data. By providing this information, it is possible to search and catalog data.


Data governance will continue to be influenced by the laws and context of the world. With new challenges will be new ways to satisfy the concerns of both lawmakers and the general public.


Data Governance

Data governance involves several concepts that describe the characteristics and setting in which the data is found. For people in leadership positions involving data, it is critical to have some understanding of the following concepts related to data governance. These concepts are

  • Ownership
  • Quality
  • Protection
  • Use/Availability
  • Management

Each of these concepts plays a part in shaping the role of data within an organization.


Data ownership is not always as obvious as it seems. One company may be using the data of a different company. It is important to identify who the data belongs to so that the user of the data is aware of any rules and restrictions the owner has placed on its use.


Addressing details related to ownership helps to determine accountability as well. Identifying ownership can also identify who is responsible for the data, because the owners will hopefully have an idea of who should be using it. If not, this is something that needs to be clarified as well.


Data quality is another self-explanatory term. Data quality is a way of determining how good the data is based on some criteria. Commonly used criteria for data quality are completeness, consistency, timeliness, accuracy, and integrity.

Completeness is determining whether everything the data is supposed to capture is represented in the dataset. For example, if income is one variable that needs to be in a dataset, it is important to check that it is there.

Consistency means that the data you are looking at is similar to other data in the same context. For example, student record data is probably similar regardless of the institution. Therefore, someone with experience with student record data can tell you whether the data you are looking at is consistent with other data in a similar context.

Timeliness has to do with the recency of the data. Some data is real-time while other data is historical. Therefore, the timeliness of the data will depend on the context of the project. A chatbot needs recent data while a study of incomes from ten years ago does not need data from yesterday.

Accuracy and integrity are two more measures of quality. Accuracy is how well the data represents the population. For example, a dataset about male college students should actually contain data about male college students. Integrity has to do with the truthfulness of the data. For example, if the data was manipulated, this needs to be explained.
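Some of the quality criteria above can be made concrete with simple checks. The sketch below illustrates completeness and timeliness only; the field names, thresholds, and sample data are assumptions for the example.

```python
from datetime import date, timedelta

def completeness(records, required_fields):
    """Fraction of records containing a non-empty value for every required field."""
    ok = sum(all(r.get(f) not in (None, "") for f in required_fields)
             for r in records)
    return ok / len(records) if records else 0.0

def is_timely(record_date, max_age_days, today=None):
    """Timeliness: is the record newer than an agreed maximum age?"""
    today = today or date.today()
    return (today - record_date) <= timedelta(days=max_age_days)

# Example: one of two records is missing its income value.
rows = [{"income": 50000, "name": "A"}, {"income": None, "name": "B"}]
completeness(rows, ["income", "name"])  # → 0.5
```

Consistency, accuracy, and integrity usually require comparison against reference data or domain expertise, so they are harder to automate than the two checks shown here.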


Data protection has to do with all of the basic security concerns IT departments have to deal with today. Some examples include encryption and password protection. In addition, there may be a need to be aware of privacy concerns such as financial records or data collected from children.

There should also be awareness of disaster recovery. For example, a real disaster might wipe out data, or someone might delete it accidentally. In either case, there should be backup copies of the data. Lastly, protection also involves controlling who has access to the data.


Despite the concerns of protection, data still needs to be available to the appropriate parties and this relates to data availability. Whoever is supposed to have the data should be able to access it as needed.

The data must also be usable. The level of usability will depend on the user. For example, a data analyst should be able to handle messy data but a consumer of dashboards needs the data to be clean and ready prior to use.


Data management is the implementation of the policies developed in the areas mentioned above. The data leadership team needs to develop processes and policies for the ownership, quality, protection, and availability of data.

Once the policies are developed, they have to actually be employed within the institution, which can be difficult, as people generally want to avoid accountability and responsibility, especially when things go wrong. In addition, change is often disliked, as people gravitate toward the current norms.


Data governance is a critical part of institutions today given the importance of data. IT departments need to develop policies and plans for data in order to maintain trust in whatever conclusions are drawn from it.

Linking Plots in Plotly with R VIDEO

Linking plots involves allowing the action you take in one plot to affect another. Doing this can allow the user to uncover various patterns that may not be apparent otherwise. Using plotly, it is possible to link plots and this is shown in the video below.


Early Views on Criminology

In this post, we will look at some late 19th and early 20th-century views on criminology. In particular, we will look at the functionalist perspective and the Chicago School.

Functionalist Perspective

Emile Durkheim (1858-1917) was a major contributor to the functionalist perspective of criminology. This approach looks at crime in terms of the values and mores of society. For example, in most societies, certain values are preferred over others, such as beliefs about family roles or about music and respect. In a similar line of thought, values of justice are preferred over values that encourage crime. What this means is that what is good is preferred over what is bad, as defined by a group of people.


Deviance, defined as breaking the rules and values of a society, serves an important function of defining what is right and wrong. Deviant behavior is an example of what is not acceptable, and by seeing this negative behavior, what is good is defined. For example, most cultures believe stealing is wrong, and this belief implies that asking for something, or being given something willingly, is good.

Another tenet of the functionalist perspective is that fighting deviance helps to strengthen the cohesion and unity of a society. There are many examples of horrendous crimes galvanizing a community to pass laws against such abhorrent behavior. In other words, after something terrible happens, society will rise up to make sure it never happens again, and this only happens because deviant behavior took place.

On the flip side, if deviance is tolerated long enough, it can help to establish new social norms. Many ideas that are supported and championed today were at one time or another considered deviant behavior. Views on reproductive rights, sexuality, and gender roles have faced tremendous pressure to change. Proponents of behaviors that were once considered deviant have rallied to press their views into the forefront and place people who do not share their views on the defensive.

Chicago School and Crime

Another view of criminology that was developed around the same time as Durkheim’s work is the Chicago School perspective. This view was developed and encouraged by Robert Ezra Park (1864-1944). What was truly unique about this approach was its focus specifically on city life and the crime that happens there.

Park explained that crime is worst in cities because of the structure of society. Cities allow a higher degree of anonymity, which can convince people they can get away with something without damaging any of their relationships. In addition, cities are more tolerant of diversity and thus of deviance.

Park also shared the idea that crime is concentrated in certain areas of a city. This idea is based on how cities were designed. Certain parts of town were zoned in certain ways, and industrial areas often have more crime than suburban areas.

Crime is also considered a learned behavior. This idea was surprising for its time because many during Park’s era believed that people had a genetic predisposition to crime. For Park to place the blame on learning from others was a unique view.

Lastly, Park believed that agencies were the best defense against crime. For example, developing and funding government departments that deal with criminal behavior would likely have been supported by Park.


The two views on criminology shared in this post provide insights into how researchers in the past viewed crime and its factors. Naturally, no single theory explains everything about a phenomenon. However, examining different theories helps a reader to understand the field of their studies.


Early Forms of Criminology

In this post, we will look at some of the first schools of thought on crime. The two schools, in particular, we will look at are classical and positivist criminology. Both of these schools of thought are still found in varying degrees in the modern era.

Classical Criminology

The classical school of criminology dates back to the 18th century. Major influencers of this school of thought include Cesare Beccaria and Jeremy Bentham. Classicism is also based heavily on ideas from the Enlightenment. For example, there is an assumption in classicism that people are rational and free-willed and weigh the risks and rewards of their actions. For criminologists, the assumption of rational thought implies that criminals go through a decision-making process before committing a crime. Therefore, if the deterrents are strong enough, people will not choose to break the law.


Social contract theory was another tenet of the classical school. Social contract theory states that people come together to make society work. In other words, people make agreements among themselves to abide by certain rules which implies that there is some form of a deterrent if people do not follow the rules.

A closely related idea to social contract theory is utilitarianism. Utilitarianism states that whatever is useful to most people should be implemented. Within the context of criminology, this means that laws that benefit the most people should be adopted and enforced.

Secularism is another critical component of classicism. Essentially, secularism within the context of criminology is based on the idea that man should make laws rather than God or the church. In other words, secularism seeks to push religious morals out of the criminal justice system. All forms of divine revelation and God’s law should be removed, and reason should be the mechanism for determining right and wrong.

In terms of punishment, classical criminologists wanted to move from barbaric to more rational forms of punishment. During the 18th century, people were hanged, drawn and quartered, burned alive, tortured, mutilated, etc., for significant and even small crimes, depending on their social rank. Classicists wanted to move to other forms of punishment such as imprisonment. They wanted the punishment to match the crime, along with a measure of humanity in the method of correction.

Criticisms of classicism include its assumption that people are mostly rational in their decision-making. People are often driven by emotions, which classicism generally ignores. In addition, classicism absolves society of any role in individuals breaking laws, because it is assumed that society is fair and just, which is often not true.

Positivist Criminology 

The positivist school of criminology originated in the 19th century and was a reaction to the classical school. Major proponents of this school included Cesare Lombroso, Enrico Ferri, and Francis Galton. The supporters of positivism used a scientific approach to addressing crime and its motives. Whereas classicism blamed the individual, positivism would blame a person’s genetic makeup and/or society as a whole.

Positivism has a deterministic view of crime based on the physical characteristics of an individual. For example, studies were done to determine criminality by body type, the shape of the skull, and even the chromosomal makeup of people, such as how many Y chromosomes a person had. Men with XYY chromosomes were deemed more dangerous than individuals with the more commonly found XY pair.

Since there was an emphasis on the appearance of the individual, it was commonly believed that criminals were different from the rest of society. Criminals were seen as driven into crime by forces outside of their own control. This implies a reduction in harsher sentences, because people are seen as not fully responsible when they commit crimes.

Positivism has its own problems. The traits found in criminals that are claimed to cause them to commit crimes are commonly found in the general public. In addition, it is difficult to establish causation just because a group of people all share similar traits and behavior. Lastly, positivism removes self-agency and the freedom of the individual to choose to do good or evil.


Understanding the origins of different ideas in a discipline can help an individual to appreciate the source of diversity of thought that is found in different places. Classicism and positivism serve different purposes in criminology. Each approach plays a critical role in shaping criminology in the world today.


Common Juvenile Dispositions and the Classroom

A disposition is the “punishment” for a youth who is found to be a delinquent, which means that they committed some sort of criminal offense. In this post, we will look at several common dispositions and how they may affect the student and teacher. The dispositions are ordered generally from those having the least impact on the student and teacher to those having the most impact on the learning of the youth.

Low Impact

Low-impact dispositions do not affect the youth’s ability to study and continue to live a normal life.

Informal consent decree. An informal consent decree involves a youth and their parents agreeing to some form of a treatment program. If they agree to treatment there is no official disposition hearing which implies no formal record. However, the type of treatment agreed to can have varying impacts on the youth and their classroom experience. In general, the impact on learning is low because the judge and prosecutor were willing to accept this form of punishment for the youth’s actions.


Mandatory school attendance. This disposition is the result of truancy. If a child is not going to school, requiring them to attend class can make a positive impact in terms of getting the student back on track academically. Unfortunately, going to school and learning can be different things for students. In addition, it is possible that a student who is forced to go to school could be disruptive as well.

Financial restitution/Fines. Some courts require monetary compensation to the victim and/or the state for whatever offense was committed. Generally, this will not impact the learning of the youth. Exceptions might be if the financial compensation compels the youth to seek employment that affects their ability to study.

Medium Impact

The next few dispositions have a medium influence on the youth’s ability to study. This may be because of scheduling conflicts and/or time commitments that reduce the time available for study.

Probation. Probation involves placing a youth under the supervision of an adult. The probation officer gives the youth certain rules to obey as a condition of the probation. The challenge when it comes to learning is that at times the youth may have to miss school for probation meetings and or the probation officer may want to speak with the teachers of the youth.

Home Detention. Home detention requires that the youth be at home at certain times. Normally, attending school is allowed; however, the probation officer needs to know where the youth is, and this often involves an electronic monitoring device worn around the ankle. The device can be disruptive if other students gather around to stare and laugh at the youth wearing it. In other words, if the student lets everyone know they are wearing the device, it could lead to disruptions in class until the novelty wears off.

Community service. Community service involves work in the community as restitution. Such a punishment is not too disruptive. However, the time given to community service will detract from the time given to study.

Outpatient psychotherapy. This treatment is for youth with clear psychological disorders. For youth with mental health issues, school is often not a problem given the other challenges in life. In addition, the teacher also needs to be careful not to destabilize the youth through insensitive actions.

Drug and alcohol treatment. Like psychotherapy, a youth who needs drug and alcohol treatment probably is not in a position to focus on their studies.

High Impact

All of the examples in this section involve the youth being removed from their home. As such, the child may no longer even be enrolled in the school that they used to attend.

Commitment to residential community program. Commitment to a residential community program involves the youth being removed from the home but placed in a community program that should be close to home. How close depends on the available services. A youth from a rural setting will potentially be further from home than an urban youth. In other cases, it is possible that the youth will no longer be in attendance at their regular school.

Commitment to secure treatment. Secure treatment is a euphemism for a child being placed essentially in prison. In other words, their freedom has been taken from them for a period of time. In such a situation, a student will no longer continue in their prior academic efforts until they complete their sentence. While committed, the youth will participate in academic work provided by the facility.

Foster home placement. Placement in a foster home involves the child being removed from their own home and placed in the home of strangers who are supposed to provide a family environment for the youth. Of course, this is highly disruptive and can involve the child moving anywhere in their state of residency. As such, this can be highly difficult for the education and learning of the youth.


When youths make mistakes, there are often punishments involved. The examples shared here are just some of the ways a youth can have their freedoms at least partially altered because of bad choices. Naturally, the choices students make can also have impacts on their learning and their interaction with their teacher.


Brief History of Policing and Juveniles

This post will provide a brief overview of policing as found in England and the USA. There will also be a link to juvenile justice.

Early forms of Policing

Policing has changed a great deal over time in England. One of the earliest forms of policing was the pledge system, which took place before the Norman conquest. In the pledge system, neighbors would work together to protect each other’s property from thieves and other criminal threats. In addition, the pledge system was also supposed to deal with problems among neighbors, which implies that people were expected to police themselves.


Naturally, the pledge system had its limitations. Its strength depended on the strength of the community, which may not have been enough to deal with serious threats. In addition, the pledge system could be politically messy, as self-policing would be based on who had power and/or charisma rather than on right and wrong. However, the pledge system did allow a great deal of autonomy for those involved.

The next system of policing that moved through England was the watch system. The watch system was designed for urban and densely populated areas. Men were organized in their parish to patrol their communities at night. Eventually, England turned away from citizen-led policing to professionals. The professionalization of law enforcement led to the development of such positions as the constable for policing and the justice of the peace for judicial matters.

As the Industrial Revolution continued, further development of law enforcement was also needed. The need for better policing led to the creation of the bobbies. However, this next step in crime fighting was not successful in stopping crime and was often influenced by money and politics.

United States

In the US, the sheriff was the equivalent of a constable or bobby. By the mid-19th century, police departments were already being established. During this time, juvenile offenders were treated just like adults if they were accused of committing a crime. With time, reforms ended this practice, but law enforcement professionals continued to share concerns about the lack of appropriate treatment for young people.

In many departments today, the police will have either a juvenile officer or a unit that focuses on dealing with youths. There is also an emphasis in some jurisdictions on community policing which involves reducing people’s fear of law enforcement through decentralization of decision-making and community involvement.

When dealing with juveniles, there is a great deal of discretion in how to handle each child’s situation. In many places, the focus is on treatment rather than punishment. What this means is that the police officer can choose whether or not to “arrest” a juvenile in many different situations. The same idea applies to the probation officer and prosecutor, who determine whether or not to pursue the arrest for punishment or disposition, and this extends to the judge as well. The flexibility of discretion has led to accusations of unfairness and even racism.

Another problem with discretion is the inconsistencies in how police approach youth. There have been court cases over such matters as search and seizure, Miranda rights, habeas corpus, and other technical legal matters because law enforcement did not carefully consider the constitutional rights of children because it has often been assumed that children do not have these rights.


Policing has seen a great deal of growth and development over the years. Despite the flaws in the system, law enforcement is still dedicated to helping keep people safe. Evidence for this can be seen in the reforms that have been made to maintain the trust of the communities they serve.


Child Savers

During the 19th century in the United States, there was a huge influx of people into urban areas in search of jobs and other opportunities. With this change in how people lived came a change in the family as well. Before the growth of urban centers, parents and children were together in rural, farm-like communities, and this helped to monitor and control a child’s behavior. Now, parents often had to leave their children unattended for long periods while they worked. Children who are left unattended tend to get into trouble. Over time, with years of continuing neglect, some of these youths became delinquents. Once children began to turn to crime, local governments began to step in and try to deal with the problem.

One solution that was tried was developed by a group of juvenile reformers from the Child Saving movement. In this post, we will look at the history and beliefs of the members of this movement.

Child Saving Movement

The Child Saving Movement wanted the government to monitor and control the activities of wayward youth. As mentioned before, this used to be a responsibility of the family but there was a breakdown in the family as a result of living in the new conditions of dense city life.

To help delinquent youths, proponents of the Child Saving movement developed the House of Refuge, which was an early form of reform school. Wayward youths were sent there for everything from status offenses to major crimes. Before this, youths were often sent to adult prisons for the offenses they committed.

The House of Refuge was funded primarily privately. Ironically, however, some funding came from the state of New York, and this funding involved taxes collected from bars, theaters, and even circuses. In other words, venues that contributed to delinquency were used to reform youths who were delinquents.

The House of Refuge opened in 1825 with only six youths. Within the first ten years, the facility would serve 1,600 youths. Both boys and girls were housed at the facility. Boys were taught blue-collar skills such as woodworking, while girls were taught skills related to the home such as cooking and sewing.

The original location of the House of Refuge was in the city. However, with time, the facility was moved to a rural location. In all, the House of Refuge lasted about 100 years, well into the early 20th century. Among the main criticisms of this approach was that the facility tried to play the role of the parent through the use of strict discipline and long work hours.


There are strengths and weaknesses to all forms of reform for young people who become delinquents. Most movements get some things wrong and do other things well. What all reform movements have in common is a desire to help young people and to make society safer. Sometimes it might be better to focus on this rather than on the failures of various movements such as the Child Savers.