Here are five basic ways hate groups use the internet to get their messages out and stir up violence.
1. Social networking
Mainstream social networking outlets such as Twitter and Facebook have struggled with how to handle hate groups on their platforms. These sites often find themselves trying to balance the right to share and debate ideas with the responsibility to protect society against potential attacks.
Facebook said it has its own internal guidelines about what constitutes a hate group.
Simply being white supremacists or identifying as "alt-right" doesn't necessarily qualify. A person or group must threaten violence, declare it has a violent mission or actually take part in acts of violence.
Twitter, for its part, said it would treat hateful imagery like adult content and graphic violence: the content will be blurred, and users will need to manually opt in to view it. But Twitter didn't detail what it considers to be a hate symbol.
In late 2016, people used web platforms including bulletin board sites Reddit and 4chan to spread the "Pizzagate" conspiracy theory, which falsely claimed that top Democrats including Hillary Clinton ran a child sex operation out of a D.C. pizzeria. The false story spread to other platforms and websites, and pizzeria owner James Alefantis received death threats. Eventually a man with an assault rifle showed up at the pizzeria and fired a shot before he was apprehended. No one was hurt.
Tech companies feel increasing pressure to police speech on their platforms. But if they over-correct and ban speech too broadly, they risk losing customers, David Snyder, executive director of the First Amendment Coalition, told CNNMoney.
2. Video platforms
In a 2017 blog post, YouTube said it would restrict borderline videos even when they don't break its rules: "... (W)e will be taking a tougher stance on videos that do not clearly violate our policies — for example, videos that contain inflammatory religious or supremacist content. In future these will appear behind an interstitial warning and they will not be monetized, recommended or eligible for comments or user endorsements."
YouTube responded to CNN's investigation with a written statement: "When we find that ads mistakenly ran against content that doesn't comply with our policies, we immediately remove those ads. We know that even when videos meet our advertiser friendly guidelines, not all videos will be appropriate for all brands. But we are committed to working with our advertisers and getting this right."
3. Online funding
Without the convenience of using the internet to raise funds, many hate groups would be crippled.
The Southern Poverty Law Center, a nonprofit that monitors hate groups in the US, said organizers, speakers and individuals attending last year's Charlottesville rally used PayPal to move money ahead of the event.
"If we become aware of a website or organization using our services that may violate our policies, our highly trained team of experts addresses each case individually and carefully evaluates the website itself, any associated organizations, and their adherence to our policy," PayPal said in a blog post.
Popular crowdfunding site GoFundMe also took a stand against hate speech following Charlottesville. The platform shut down multiple campaigns to raise money for James Fields, the man accused of driving his car into a crowd at the rally, killing one woman and injuring dozens more.
The company said those campaigns did not raise any money and were immediately removed.
Some hate groups get around traditional funding sites by using alt-right-focused fundraising platforms and cryptocurrency. Cryptocurrency is a relatively new kind of currency for the digital era. It works across international borders and doesn't need to be backed by banks or governments.
4. Websites
Websites are a basic piece of the hate propaganda machine. But Charlottesville may have made it more difficult for white supremacist and Neo-Nazi websites to remain online.
But getting kicked off a web host merely forces a site to go somewhere else — ultimately prompting a game of internet-domain whack-a-mole. It also raises issues around what domain-hosting companies are responsible for, and where they draw the line on objectionable material.
"Legally, they don't have any responsibility around this, unless it's a federal crime (such as child pornography) or intellectual property," Daphne Keller, the director of intermediary liability at the Stanford Center for Internet and Society, told CNN Tech last August.
5. The dark web
If you've never heard of the dark web, here's a primer. The dark web is a part of the internet that isn't indexed by Google or most common search engines. It can only be accessed with special software, such as the Tor browser.
Hate groups may find less policing on the dark web, but because relatively few people use it, the potential audience is small.
Why haven't tech companies done more to combat hate groups online?
CNN Digital's Sara Sidner, Mallory Simon, Kaya Zurief, Sara Ashley O'Brien, Selena Larson, Danielle