Broken Windows Theory of Policing to Reduce Online Crime

Exploring Ways That Online Platforms Have Neglected Their Users and Not Consistently Enforced Their Own Rules

Primary Category: Online Criminology

Author:
•  Tim McGuinness, Ph.D. – Anthropologist, Scientist, Director of the Society of Citizens Against Relationship Scams Inc.

About This Article

The “Broken Windows” theory of policing, which emphasizes addressing minor infractions to prevent more serious crime, can be applied to combat online crime by maintaining order in digital spaces.

As online crime has surged in recent years, platforms need to focus on strict enforcement of rules against small offenses like spam, fake accounts, and minor cyber threats to prevent escalation into larger issues like identity theft, phishing, or hacking. By proactively monitoring activity, empowering users to report suspicious behavior, and fostering a secure online environment, platforms can create a culture of accountability.

Swift action against minor misconduct, collaboration with cybersecurity experts, and maintaining visible security measures can help prevent serious cybercrime and establish trust among users.

The Theory of Broken Windows in Policing Can Help Reduce Online Crime

For the last two and a half decades, we have seen a continuous increase in online crime. The only real reductions occurred from 2017 through 2019; since then, online crime has grown by more than 70% per year.

A theory that sheds some light on this is the 'Broken Windows' theory of policing.

What is the Broken Windows Theory of Policing?

The "Broken Windows" theory of policing, developed by social scientists James Q. Wilson and George L. Kelling in the early 1980s, suggests that visible signs of disorder and neglect, such as broken windows, graffiti, or public loitering, can lead to an increase in crime. The idea is that if minor issues are not addressed, it creates an environment where more serious crimes are more likely to occur.

Key principles of the theory:

  • Disorder Breeds Crime: Small signs of neglect, like a broken window, signal that no one is taking care of the property or neighborhood. This perceived lack of oversight can encourage further disorder, leading to more serious criminal activity.

  • Prevention Through Maintenance: By addressing minor offenses like vandalism, public intoxication, or fare evasion, police and communities can maintain order and prevent an escalation into more serious crimes. The theory supports the idea that maintaining a well-ordered environment helps prevent crime from taking root.

  • Community Involvement: It also emphasizes the role of community members in keeping neighborhoods safe. When people actively care for and maintain their surroundings, it can create a sense of shared responsibility, discouraging criminal behavior.

Critics of the theory argue that it has sometimes led to over-policing and disproportionately affects marginalized communities. Nonetheless, it has influenced policing strategies like "zero-tolerance" approaches, particularly in cities like New York in the 1990s.

Understanding 'Broken Windows' in the Online Space

The "Broken Windows" theory can be applied to online crime by emphasizing the importance of addressing minor forms of disorder or misconduct in digital spaces to prevent more significant cybercrimes.

Here’s how the theory could be adapted for the online environment:

Addressing Minor Online Offenses

In the online world, small infractions like spam emails, fake social media accounts, trolling, or the posting of inappropriate content are the digital equivalent of physical "broken windows." If these are left unaddressed, they signal that the platform or site is poorly monitored, encouraging malicious actors to feel emboldened to escalate their activities.

      • Examples of Minor Infractions: Fake social media profiles, bots that post spam or deceptive links, minor instances of phishing, low-level hacking (e.g., testing vulnerabilities), or cyberbullying. When platforms don’t actively moderate these behaviors, users may perceive the site as insecure or unsafe, and criminal elements may use the opportunity to further exploit vulnerabilities.

      • Prevention Through Early Intervention: Proactively removing fake accounts, filtering spam, and penalizing bad behavior discourages more harmful criminal activities like online scams, ransomware attacks, and identity theft. For instance, quick removal of spam or phishing attempts from email services may stop users from falling victim to larger-scale fraud.
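
To make the idea of early intervention concrete, here is a minimal, hypothetical sketch of a heuristic spam and phishing filter in Python. The phrase list, domain blocklist, scoring weights, and threshold are all invented for illustration; a production system would rely on trained classifiers and curated threat-intelligence feeds rather than hard-coded rules.

```python
import re

# Hypothetical keyword and domain lists for illustration only; a real platform
# would maintain these from threat-intelligence feeds and trained models.
SUSPICIOUS_PHRASES = ["verify your account", "urgent action required", "claim your prize"]
KNOWN_BAD_DOMAINS = {"example-phish.test", "free-gift-cards.test"}

URL_PATTERN = re.compile(r"https?://([^/\s]+)", re.IGNORECASE)

def spam_score(message: str) -> int:
    """Return a simple heuristic score; higher means more likely spam or phishing."""
    score = 0
    lowered = message.lower()
    for phrase in SUSPICIOUS_PHRASES:
        if phrase in lowered:
            score += 2
    for domain in URL_PATTERN.findall(message):
        if domain.lower() in KNOWN_BAD_DOMAINS:
            score += 5
    return score

def should_quarantine(message: str, threshold: int = 4) -> bool:
    """Quarantine the message for review if its score crosses the threshold."""
    return spam_score(message) >= threshold

if __name__ == "__main__":
    sample = "Urgent action required: verify your account at http://example-phish.test/login"
    print(should_quarantine(sample))  # True -> route to quarantine/review
```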

Maintaining Digital Order

Maintaining order in digital spaces is crucial in preventing serious cybercrime. Just as a neighborhood that is kept clean and orderly deters physical crime, a well-moderated online environment discourages illegal activities.

      • Moderation and Enforcement: Social media platforms, websites, and digital services need consistent and visible moderation policies that are actively enforced. This includes flagging and removing harmful content, responding quickly to reports of misconduct, and taking swift action against offenders. When users see that the digital space is being actively managed, it deters them from engaging in malicious behavior, just as a well-kept environment discourages vandalism or theft.

      • Platform Responsibility: Online platforms can enhance their security by investing in AI-driven tools to detect and remove harmful content (e.g., scams, fake news, deepfakes) and by employing dedicated moderators to enforce community standards. By doing so, platforms signal that criminal behavior is not tolerated, reducing the likelihood of more serious offenses such as data breaches or hacking campaigns.
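
As an illustration of how automated detection can be combined with human review, the sketch below routes content into auto-removal, a human review queue, or normal publication based on a risk score. The thresholds and the stand-in risk model are assumptions for illustration only; a real deployment would use the platform's own trained models and policy thresholds.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class ModerationQueues:
    removed: List[str] = field(default_factory=list)   # content removed automatically
    review: List[str] = field(default_factory=list)    # content held for human moderators
    allowed: List[str] = field(default_factory=list)   # content published normally

def triage(content: str,
           risk_model: Callable[[str], float],
           queues: ModerationQueues,
           remove_at: float = 0.9,
           review_at: float = 0.5) -> str:
    """Route content by model risk score: auto-remove, human review, or allow."""
    risk = risk_model(content)
    if risk >= remove_at:
        queues.removed.append(content)
        return "removed"
    if risk >= review_at:
        queues.review.append(content)
        return "review"
    queues.allowed.append(content)
    return "allowed"

if __name__ == "__main__":
    # Stand-in risk model for illustration; real platforms would call a trained classifier.
    fake_model = lambda text: 0.95 if "free crypto giveaway" in text.lower() else 0.1
    q = ModerationQueues()
    print(triage("Join our FREE CRYPTO GIVEAWAY now!", fake_model, q))  # removed
    print(triage("Great photo from your trip!", fake_model, q))         # allowed
```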

Community Engagement and User Accountability

A key aspect of the "Broken Windows" theory is community involvement in maintaining order. Online, this translates to engaging users in the process of identifying and reporting harmful behavior, thereby creating a collaborative effort to maintain the safety of digital spaces.

      • User Empowerment: Encouraging users to report suspicious behavior, fake profiles, and potential scams strengthens the platform’s ability to respond quickly. When users feel empowered to take an active role in maintaining the integrity of the digital space, it reinforces a culture of accountability.

      • Creating a Safe Digital Community: Platforms that involve their users in policing the space, through clear reporting mechanisms and feedback loops, foster a safer online community. When users report bad actors, and see swift action taken, it dissuades potential criminals from exploiting the platform. Similarly, a culture of accountability encourages users to act responsibly, knowing that their peers are also keeping an eye out for misconduct.
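
Below is a minimal sketch of the reporting and feedback loop described above: a report is filed, resolved, and the reporter is told the outcome. The class names, statuses, and notification mechanism are hypothetical simplifications, not any particular platform's API.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Dict, List, Optional

@dataclass
class Report:
    report_id: int
    reporter: str
    target: str            # e.g., an account handle or post URL
    reason: str
    status: str = "open"   # open -> actioned / dismissed
    created: Optional[datetime] = None

class ReportDesk:
    """Minimal report intake with a feedback loop back to the reporter."""

    def __init__(self) -> None:
        self._reports: Dict[int, Report] = {}
        self._next_id = 1
        self.notifications: List[str] = []  # stand-in for emails or in-app messages

    def file(self, reporter: str, target: str, reason: str) -> int:
        report = Report(self._next_id, reporter, target, reason,
                        created=datetime.now(timezone.utc))
        self._reports[report.report_id] = report
        self._next_id += 1
        return report.report_id

    def resolve(self, report_id: int, outcome: str) -> None:
        report = self._reports[report_id]
        report.status = outcome
        # Closing the loop: tell the reporter what happened to their report.
        self.notifications.append(
            f"To {report.reporter}: your report #{report_id} about {report.target} was {outcome}."
        )

if __name__ == "__main__":
    desk = ReportDesk()
    rid = desk.file("alice", "@fake_profile_123", "impersonation")
    desk.resolve(rid, "actioned")
    print(desk.notifications[0])
```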

Preventing Escalation of Online Crime

The Broken Windows theory suggests that ignoring small infractions leads to larger crimes. In the online world, allowing minor cybercrimes to go unchecked can lead to the escalation of more serious offenses.

      • Phishing and Scams: For example, a minor phishing attempt that goes unaddressed can embolden cybercriminals to scale up their operations, targeting more individuals or organizations. A hacker who faces little resistance may proceed from probing a website’s vulnerabilities to conducting a full-scale data breach or ransomware attack.

      • Financial Fraud: Early intervention in cases of financial fraud, such as small unauthorized charges on a user’s credit card, can prevent more significant fraud from occurring. If such attempts are quickly blocked and reported, it disrupts the criminal’s ability to escalate to more serious theft or identity fraud.
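
One common early-warning pattern for payment fraud is "card testing", where criminals probe a stolen card with several tiny charges before attempting larger theft. The sketch below flags a burst of small charges on a single card; the dollar limit, time window, and burst count are illustrative assumptions, not industry standards.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

# Illustrative thresholds; real fraud systems tune these from historical data.
SMALL_CHARGE_LIMIT = 2.00      # dollars: typical "card testing" amounts are tiny
BURST_WINDOW_SECONDS = 300     # five minutes
BURST_COUNT = 3                # three or more small charges in the window is suspicious

def flag_card_testing(charges: List[Tuple[str, float, int]]) -> List[str]:
    """
    charges: list of (card_id, amount, unix_timestamp).
    Returns card_ids showing a burst of small charges, a common precursor to
    larger fraud if the stolen card turns out to be live.
    """
    small: Dict[str, List[int]] = defaultdict(list)
    for card_id, amount, ts in charges:
        if amount <= SMALL_CHARGE_LIMIT:
            small[card_id].append(ts)

    flagged = []
    for card_id, times in small.items():
        times.sort()
        for start in times:
            # Count small charges inside the sliding window starting at this charge.
            in_window = [t for t in times if start <= t < start + BURST_WINDOW_SECONDS]
            if len(in_window) >= BURST_COUNT:
                flagged.append(card_id)
                break
    return flagged

if __name__ == "__main__":
    activity = [("card_A", 1.00, 1000), ("card_A", 0.99, 1060), ("card_A", 1.50, 1120),
                ("card_B", 45.00, 1000)]
    print(flag_card_testing(activity))  # ['card_A']
```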

Creating a Culture of Security and Trust

Platforms that actively maintain digital order create a culture of security and trust, making it less likely that serious cybercrimes will flourish. When users feel confident that a platform is safe and well-regulated, they are more likely to engage, while criminals are deterred by the platform’s reputation for strict enforcement.

      • Visible Security Measures: Just as a well-kept neighborhood discourages crime, visible digital security measures, such as two-factor authentication, encryption, and regular updates, signal that the platform takes cybersecurity seriously (a minimal sketch of one such measure, time-based one-time passwords, follows this list). Users are less likely to become victims of scams, and criminals are less likely to target a space that has strong defenses.

      • Building Trust in Digital Transactions: Platforms that prioritize security foster greater trust among users, especially in online financial transactions. E-commerce sites or payment processors that visibly address even minor fraud cases will be seen as more trustworthy, which can reduce the risk of serious financial crimes like identity theft or credit card fraud.
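
As a concrete example of a visible security measure, here is a minimal sketch of time-based one-time passwords (TOTP, per RFC 6238), the mechanism behind most authenticator-app two-factor codes, using only the Python standard library. The demo secret is a placeholder, and a real service would also handle secret provisioning, replay protection, and rate limiting.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, for_time: int = None, digits: int = 6, step: int = 30) -> str:
    """Compute an RFC 6238 time-based one-time password from a base32 secret."""
    if for_time is None:
        for_time = int(time.time())
    key = base64.b32decode(secret_b32.upper())
    counter = struct.pack(">Q", for_time // step)            # 8-byte big-endian counter
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def verify(secret_b32: str, submitted_code: str, window: int = 1, step: int = 30) -> bool:
    """Accept codes from the current step and +/- `window` steps to tolerate clock drift."""
    now = int(time.time())
    for drift in range(-window, window + 1):
        if hmac.compare_digest(totp(secret_b32, now + drift * step, step=step), submitted_code):
            return True
    return False

if __name__ == "__main__":
    demo_secret = "JBSWY3DPEHPK3PXP"  # example base32 secret, not a real credential
    code = totp(demo_secret)
    print(code, verify(demo_secret, code))  # the freshly generated code verifies as True
```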

Setting a Precedent

One of the most critical elements of the Broken Windows theory is that addressing small problems prevents them from becoming large problems. In the online world, this means that by cracking down on small cybercrimes, platforms can set a precedent that discourages more significant threats.

      • Zero Tolerance for Malicious Behavior: If a platform takes a strong stance against small infractions, such as disinformation campaigns or minor hacking attempts, it sends a clear message that larger attacks will not be tolerated. This can prevent criminal networks from attempting more dangerous exploits on the platform.

      • Deterrence Through Swift Action: Swift and visible action against small offenses has a ripple effect, deterring others from committing similar or more serious crimes. Criminals are less likely to target platforms or users who actively report and address even minor violations of rules.

9 Recommendations for Online Platforms

Here is a series of recommendations that online platforms can implement to apply the Broken Windows theory to controlling online crime:

1. Strict Enforcement of Rules Against Minor Violations

      • Develop Clear Community Guidelines: Ensure that community guidelines are transparent, comprehensive, and prominently displayed. Outline what constitutes minor infractions such as spam, fake accounts, or inappropriate content.

      • Zero Tolerance for Minor Offenses: Actively enforce penalties for small offenses like trolling, harassment, fake profiles, and low-level hacking. Swift action sends a message that no level of misconduct will be tolerated.

      • Automated Moderation Tools: Invest in AI and machine learning algorithms to detect and remove spam, phishing attempts, and harmful content in real time.
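
One simple, widely used automated control is rate limiting: throttling accounts that post links faster than a legitimate user plausibly would. The sketch below shows a sliding-window limiter; the five-posts-per-ten-minutes limit is an invented example, not a recommended setting.

```python
import time
from collections import defaultdict, deque
from typing import Optional

# Illustrative limits: a hypothetical cap of 5 link posts per 10 minutes per account.
MAX_POSTS = 5
WINDOW_SECONDS = 600

class LinkPostLimiter:
    """Sliding-window rate limiter that throttles accounts posting links too quickly."""

    def __init__(self) -> None:
        self._history = defaultdict(deque)  # account_id -> timestamps of recent link posts

    def allow(self, account_id: str, now: Optional[float] = None) -> bool:
        now = time.time() if now is None else now
        recent = self._history[account_id]
        # Drop timestamps that have fallen outside the window.
        while recent and recent[0] <= now - WINDOW_SECONDS:
            recent.popleft()
        if len(recent) >= MAX_POSTS:
            return False  # rate-limit this post and optionally flag the account
        recent.append(now)
        return True

if __name__ == "__main__":
    limiter = LinkPostLimiter()
    results = [limiter.allow("new_account_42", now=100 + i) for i in range(7)]
    print(results)  # first five allowed, the last two throttled
```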

2. Proactive Monitoring and Maintenance of Digital Spaces

      • Routine Platform Audits: Regularly review platform activity for signs of disorder, such as high levels of unaddressed spam, fake accounts, or dormant user reports. This proactive approach ensures that minor issues don’t grow unchecked.

      • Immediate Removal of Malicious Content: Implement systems to swiftly take down harmful or fraudulent content, including hate speech, scams, or misinformation, before it can spread and cause greater harm.

3. Empower and Engage the User Community

      • Encourage User Reporting: Make it easy for users to report suspicious activity, such as fake accounts, phishing attempts, or inappropriate content. Provide clear reporting mechanisms on every page or within every interaction.

      • Reward User Vigilance: Offer small incentives for users who actively help maintain order on the platform, such as "trusted user" badges, or other recognitions for contributing to platform safety.

      • Real-Time Response to User Reports: Ensure that reports from users are promptly addressed, and provide feedback loops so that users know their reports lead to action.

4. Prioritize Early Intervention to Prevent Escalation

      • Block and Ban Repeat Offenders: Develop strict policies for users who repeatedly violate rules, even if their infractions are minor. Early intervention helps prevent them from progressing to larger, more damaging crimes (see the strike-tracking sketch after this list).

      • Identify and Shut Down Fraudulent Networks: Monitor patterns of behavior that indicate the presence of organized fraud networks, and take steps to dismantle them before they escalate to significant cybercrime.
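
A common way to operationalize a repeat-offender policy is a strike system with escalating penalties. The sketch below is a hypothetical illustration; the penalty ladder and the decision to escalate purely by count are assumptions, and real policies usually weigh offense severity and allow appeals.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical escalation ladder; real policies vary by platform and offense type.
PENALTIES = ["warning", "24h suspension", "7d suspension", "permanent ban"]

@dataclass
class StrikeTracker:
    strikes: Dict[str, List[str]] = field(default_factory=dict)

    def record_violation(self, account_id: str, violation: str) -> str:
        """Log a violation and return the penalty dictated by the account's strike count."""
        history = self.strikes.setdefault(account_id, [])
        history.append(violation)
        # Escalate with each strike; everything past the ladder is a permanent ban.
        index = min(len(history) - 1, len(PENALTIES) - 1)
        return PENALTIES[index]

if __name__ == "__main__":
    tracker = StrikeTracker()
    for offense in ["spam link", "fake profile", "harassment", "phishing attempt"]:
        print(offense, "->", tracker.record_violation("user_99", offense))
    # Output escalates: warning, 24h suspension, 7d suspension, permanent ban
```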

5. Foster a Secure and Trustworthy Digital Environment

      • Visible Security Measures: Promote visible security features like two-factor authentication, encryption, and regular updates to remind users that the platform takes safety seriously.

      • Educate Users on Cybersecurity: Provide resources and training for users to help them identify and report phishing, scams, and other online threats. An educated user base is more likely to catch minor problems early.

      • Regular Security Audits: Conduct frequent audits of the platform’s security infrastructure to ensure that small vulnerabilities are addressed before they can be exploited.

6. Invest in Scalable and Real-Time Solutions

      • Real-Time Moderation: Use automated systems that can flag inappropriate or harmful content immediately for review by human moderators, ensuring fast response times to minor offenses.

      • Adaptive Algorithms: Implement algorithms that adapt to emerging threats, such as evolving scam techniques or new forms of phishing, so that the platform stays ahead of potential cybercrime.

7. Set a Strong Precedent with Visible Actions

      • Publicize Enforcement Actions: Make enforcement actions visible to users by sharing updates about significant crackdowns on scammers, hackers, or malicious networks. This creates a deterrent effect by showing that violations are not tolerated.

      • Transparency in Penalties: Clearly communicate the penalties for small infractions, such as temporary bans or account suspensions, and enforce them consistently. This reinforces the notion that all offenses, even small ones, are taken seriously.

8. Collaboration with Law Enforcement and Cybersecurity Experts

      • Partner with Cybersecurity Firms: Work with cybersecurity experts to help identify and respond to emerging threats before they become widespread on the platform.

      • Collaborate with Law Enforcement: Develop partnerships with law enforcement agencies to ensure that major offenders can be prosecuted, and provide support to investigate and dismantle criminal networks operating online.

9. Create Safe Zones for Vulnerable Users

      • Dedicated Spaces for Vulnerable Groups: Establish safe spaces for vulnerable users (e.g., seniors, newcomers to the internet) where stricter moderation helps protect them from predatory behavior or scams.

      • Advanced Privacy Controls: Implement user-friendly privacy settings that allow users to control who can interact with them, helping to prevent unwanted contact and reduce opportunities for scams.
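
Below is a minimal sketch of the kind of privacy control described above, deciding whether a sender may contact a user based on that user's settings and block list. The setting names and options are hypothetical; they stand in for whatever granular controls a platform actually offers.

```python
from dataclasses import dataclass, field
from typing import Set

@dataclass
class PrivacySettings:
    # Hypothetical options: "everyone", "connections", or "nobody".
    who_can_message: str = "connections"
    blocked: Set[str] = field(default_factory=set)

def can_contact(sender: str, recipient_settings: PrivacySettings,
                recipient_connections: Set[str]) -> bool:
    """Decide whether a sender may open a conversation with this recipient."""
    if sender in recipient_settings.blocked:
        return False
    if recipient_settings.who_can_message == "everyone":
        return True
    if recipient_settings.who_can_message == "connections":
        return sender in recipient_connections
    return False  # "nobody": no unsolicited contact at all

if __name__ == "__main__":
    settings = PrivacySettings(who_can_message="connections", blocked={"known_scammer"})
    friends = {"trusted_friend"}
    print(can_contact("trusted_friend", settings, friends))    # True
    print(can_contact("stranger_account", settings, friends))  # False
```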

By implementing these recommendations, online platforms can create a safer, more secure environment by applying the principles of the Broken Windows theory to prevent small-scale misconduct from escalating into serious online crime.

Conclusion

Applying the "Broken Windows" theory to online crime underscores the importance of addressing small-scale digital offenses to prevent more significant cybercrimes. By focusing on early intervention, maintaining digital order, involving users in policing the space, and setting a strong precedent for enforcement, platforms can create safer, more secure online environments. This approach not only deters criminal behavior but also fosters a culture of trust and responsibility, benefiting both users and the broader digital community.

Important Information for New Scam Victims

If you are looking for local trauma counselors please visit counseling.AgainstScams.org or join SCARS for our counseling/therapy benefit: membership.AgainstScams.org

If you need to speak with someone now, you can dial 988 or find phone numbers for crisis hotlines all around the world here: www.opencounseling.com/suicide-hotlines

Statement About Victim Blaming

Some of our articles discuss various aspects of victims. This is about better understanding victims (the science of victimology), their behaviors, and their psychology. It helps us educate victims/survivors about why these crimes happened so they do not blame themselves, develop better recovery programs, and help victims avoid scams in the future. At times this may sound like blaming the victim, but it is not; we are simply explaining the hows and whys of the experience victims have.

These articles about the Psychology of Scams or Victim Psychology – meaning that all humans share psychological or cognitive characteristics that can be exploited or work against us – help us all understand the unique challenges victims face before, during, and after scams, fraud, or cybercrimes. They sometimes discuss the vulnerabilities that scammers exploit. Victims rarely have control over these vulnerabilities, or are even aware of them, until something like a scam happens; afterward, they can learn how their mind works and how to overcome these mechanisms.

Articles like these help victims and others understand these processes and how to help prevent them from being exploited again or to help them recover more easily by understanding their post-scam behaviors. Learn more about the Psychology of Scams at www.ScamPsychology.org

Psychology Disclaimer:

All articles about psychology and the human brain on this website are for information and education only.

The information provided in this and other SCARS articles is intended for educational and self-help purposes only and should not be construed as a substitute for professional therapy or counseling.

Note about Mindfulness: Mindfulness practices have the potential to create psychological distress for some individuals. Please consult a mental health professional or experienced meditation instructor for guidance should you encounter difficulties.

While any self-help techniques outlined herein may be beneficial for scam victims seeking to recover from their experience and move towards recovery, it is important to consult with a qualified mental health professional before initiating any course of action. Each individual’s experience and needs are unique, and what works for one person may not be suitable for another.

Additionally, any approach may not be appropriate for individuals with certain pre-existing mental health conditions or trauma histories. It is advisable to seek guidance from a licensed therapist or counselor who can provide personalized support, guidance, and treatment tailored to your specific needs.

If you are experiencing significant distress or emotional difficulties related to a scam or other traumatic event, please consult your doctor or mental health provider for appropriate care and support.

Also read our SCARS Institute Statement about Professional Care for Scam Victims – click here

If you are in crisis, feeling desperate, or in despair please call 988 or your local crisis hotline.

PLEASE NOTE: Psychology Clarification

The following specific modalities within the practice of psychology are restricted to psychologists appropriately trained in the use of such modalities:

  • Diagnosis: The diagnosis of mental, emotional, or brain disorders and related behaviors.
  • Psychoanalysis: Psychoanalysis is a type of therapy that focuses on helping individuals to understand and resolve unconscious conflicts.
  • Hypnosis: Hypnosis is a state of trance in which individuals are more susceptible to suggestion. It can be used to treat a variety of conditions, including anxiety, depression, and pain.
  • Biofeedback: Biofeedback is a type of therapy that teaches individuals to control their bodily functions, such as heart rate and blood pressure. It can be used to treat a variety of conditions, including stress, anxiety, and pain.
  • Behavioral analysis: Behavioral analysis is a type of therapy that focuses on changing individuals’ behaviors. It is often used to treat conditions such as autism and ADHD.
  • Neuropsychology: Neuropsychology is a type of psychology that focuses on the relationship between the brain and behavior. It is often used to assess and treat cognitive impairments caused by brain injuries or diseases.

SCARS and the members of the SCARS Team do not engage in any of the above modalities in relation to scam victims. SCARS is not a mental healthcare provider and recognizes the importance of professionalism and separation between its work and the licensed practice of psychology.

SCARS is an educational provider of generalized self-help information that individuals can use for their own benefit to achieve their own goals related to emotional trauma. SCARS recommends that all scam victims see professional counselors or therapists to help them determine the suitability of any specific information or practices that may help them.

SCARS cannot diagnose or treat any individuals, nor can it state the effectiveness of any educational information that it may provide, regardless of its experience in interacting with traumatized scam victims over time. All information that SCARS provides is purely for general educational purposes to help scam victims become aware of and better understand the topics and to be able to dialog with their counselors or therapists.

It is important that all readers understand these distinctions and that they apply the information that SCARS may publish at their own risk, and should do so only after consulting a licensed psychologist or mental healthcare provider.

Opinions

The opinions of the author are not necessarily those of the Society of Citizens Against Relationship Scams Inc. The author is solely responsible for the content of their work. SCARS is protected under the Communications Decency Act (CDA) section 230 from liability.

Disclaimer:

SCARS IS A DIGITAL PUBLISHER AND DOES NOT OFFER HEALTH OR MEDICAL ADVICE, LEGAL ADVICE, FINANCIAL ADVICE, OR SERVICES THAT SCARS IS NOT LICENSED OR REGISTERED TO PERFORM.

IF YOU’RE FACING A MEDICAL EMERGENCY, CALL YOUR LOCAL EMERGENCY SERVICES IMMEDIATELY, OR VISIT THE NEAREST EMERGENCY ROOM OR URGENT CARE CENTER. YOU SHOULD CONSULT YOUR HEALTHCARE PROVIDER BEFORE FOLLOWING ANY MEDICALLY RELATED INFORMATION PRESENTED ON OUR PAGES.

ALWAYS CONSULT A LICENSED ATTORNEY FOR ANY ADVICE REGARDING LEGAL MATTERS.

A LICENSED FINANCIAL OR TAX PROFESSIONAL SHOULD BE CONSULTED BEFORE ACTING ON ANY INFORMATION RELATING TO YOUR PERSONAL FINANCES OR TAX-RELATED ISSUES AND INFORMATION.

SCARS IS NOT A PRIVATE INVESTIGATOR – WE DO NOT PROVIDE INVESTIGATIVE SERVICES FOR INDIVIDUALS OR BUSINESSES. ANY INVESTIGATIONS THAT SCARS MAY PERFORM ARE NOT A SERVICE PROVIDED TO THIRD PARTIES. INFORMATION REPORTED TO SCARS MAY BE FORWARDED TO LAW ENFORCEMENT AS SCARS SEES FIT AND APPROPRIATE.

This content and other material contained on the website, apps, newsletter, and products (“Content”), is general in nature and for informational purposes only and does not constitute medical, legal, or financial advice; the Content is not intended to be a substitute for licensed or regulated professional advice. Always consult your doctor or other qualified healthcare provider, lawyer, financial, or tax professional with any questions you may have regarding the educational information contained herein. SCARS makes no guarantees about the efficacy of information described on or in SCARS’ Content. The information contained is subject to change and is not intended to cover all possible situations or effects. SCARS does not recommend or endorse any specific professional or care provider, product, service, or other information that may be mentioned in SCARS’ websites, apps, and Content unless explicitly identified as such.

The disclaimers herein are provided on this page for ease of reference. These disclaimers supplement and are a part of SCARS’ website’s Terms of Use

Legal Notices: 

All original content is Copyright © 1991–2023 Society of Citizens Against Relationship Scams Inc. (Registered D.B.A. SCARS). All Rights Reserved Worldwide & Webwide. Third-party copyrights acknowledged.

U.S. State of Florida Registration Nonprofit (Not for Profit) #N20000011978 [SCARS DBA Registered #G20000137918] – Learn more at www.AgainstScams.org

SCARS, SCARS|INTERNATIONAL, SCARS, SCARS|SUPPORT, SCARS, RSN, Romance Scams Now, SCARS|INTERNATION, SCARS|WORLDWIDE, SCARS|GLOBAL, SCARS, Society of Citizens Against Relationship Scams, Society of Citizens Against Romance Scams, SCARS|ANYSCAM, Project Anyscam, Anyscam, SCARS|GOFCH, GOFCH, SCARS|CHINA, SCARS|CDN, SCARS|UK, SCARS|LATINOAMERICA, SCARS|MEMBER, SCARS|VOLUNTEER, SCARS Cybercriminal Data Network, Cobalt Alert, Scam Victims Support Group, SCARS ANGELS, SCARS RANGERS, SCARS MARSHALLS, SCARS PARTNERS, are all trademarks of Society of Citizens Against Relationship Scams Inc., All Rights Reserved Worldwide

Contact the legal department for the Society of Citizens Against Relationship Scams Incorporated by email at legal@AgainstScams.org