w.techdirt.com - Techdirt.

w.techdirt.com Profile


Main domain: techdirt.com

Title: Techdirt.

Description: Jul 14 2020 · Oh, copyright troll Richard Liebowitz is at it again. Last month we wrote about an absolutely massive benchslap he received as a judge detailed

Discover w.techdirt.com website stats, rating, details and status online. Use our online tools to find owner and admin contact info. Find out where the server is located. Read and write reviews or vote to improve its ranking. Check for duplicates with related CSS, domain relations, most used words, and social network references. Go to the regular site.

w.techdirt.com Information

Website / Domain: w.techdirt.com
Homepage Size: 164.589 KB
Page Load Time: 0.187382 seconds
Website IP Address: 209.41.68.26
ISP Server: Isparks Inc

w.techdirt.com IP Information

IP Country: United States
City Name: Lindon
Latitude: 40.346183776855
Longitude: -111.73179626465
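
As a point of reference, here is a minimal sketch of how the listed address could be re-checked with Python's standard library; the country, city, and coordinate fields above come from a separate GeoIP database, which this sketch does not query:

import socket

# Resolve the hostname to its IPv4 address; at crawl time this matched
# the "Website IP Address" listed above (209.41.68.26).
hostname = "w.techdirt.com"
ip_address = socket.gethostbyname(hostname)
print(f"{hostname} resolves to {ip_address}")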

w.techdirt.com Keyword Accounting

Keyword Count

w.techdirt.com HTTP Header

Date: Thu, 16 Jul 2020 02:19:57 GMT
Content-Type: text/html; charset=utf-8
Transfer-Encoding: chunked
Connection: keep-alive
Set-Cookie: __cfduid=da841af041495675245a048c5f5b68de61594865997; expires=Sat, 15-Aug-20 02:19:57 GMT; path=/; domain=.techdirt.com; HttpOnly; SameSite=Lax
Strict-Transport-Security: max-age=15552000; includeSubDomains; preload
CF-Cache-Status: DYNAMIC
cf-request-id: 03f704ed6100000271a00b4200000001
Expect-CT: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
X-Content-Type-Options: nosniff
Server: cloudflare
CF-RAY: 5b383dc23f0f0271-SJC
Content-Encoding: gzip
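
The headers above were captured at crawl time. A minimal sketch of how they could be re-fetched, assuming the third-party requests package is available (values such as Date, CF-RAY, and the Set-Cookie token change on every request):

import requests

# Request the page and print each response header returned by the server.
response = requests.get("https://w.techdirt.com/", timeout=10)
for name, value in response.headers.items():
    print(f"{name}: {value}")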

w.techdirt.com Meta Info

content="text/html; charset=utf-8" http-equiv="Content-Type"/
content="width=device-width, initial-scale=1.0" name="viewport"/

209.41.68.26 Domains

Domain - Website Title

w.techdirt.com Similar Websites

Domain - Website Title
w.techdirt.com - Techdirt.
techdirt.com - Techdirt - Wikipedia
deals.techdirt.com - Techdirt Deals
rtb.techdirt.com - Techdirt Insider Shop | Giving you a reason to buy

w.techdirt.com Traffic Sources Chart

w.techdirt.com Alexa Rank History Chart

w.techdirt.com Alexa

w.techdirt.com HTML To Plain Text

Content Moderation Case Study: Dealing With Misinformation In Search (2004)

Culture, from the misinformation-goes-way-back-online dept
Wed, Jul 15th 2020 3:49pm — Mike Masnick

This series of case studies is published in partnership with the Trust & Safety Foundation to examine the difficult choices and tradeoffs involved in content moderation. Learn more »

Summary: Google’s biggest early innovation in search was that it used inbound links as a tool for determining the popularity of a website, and thus what its relevance to a particular search might be. That feature, however, created some side effects that raised concerns about how search results might lead to misinformation, or how the search engine might be gamed.

One of the earliest examples of this was the discovery in 2004 that the first result of a search on the word “jew” pointed to a blatantly anti-semitic website, Jewwatch. It was widely theorized that the reason for this was that the singular noun “jew” was more likely to be used by those pushing anti-semitic arguments, rather than the more common adjective “jewish” or the phrase “jewish wo/man” etc. Also, the site Jewwatch had been in existence for many years, and had many inbound links from other sources. Some also believed that the people behind Jewwatch had used an early search engine optimization technique known as “Googlebombing” to purposefully game the results — deliberately linking to Jewwatch from other sites, and using the word “jew” as the link text.

As this result got attention, Google came under tremendous pressure to change the search result, as people accused the company of anti-semitism or deliberately pointing to the Jewwatch site in search results. The Anti-Defamation League sent a letter to Google asking it to explore whether or not its ranking system needed to be changed (though the ADL also posted an article to its own site telling people that it was clear that the result was not intentional, or done for nefarious reasons). Some politicians, including Senator Chuck Schumer, also got involved to pressure Google to change its results.

Decisions to be made by Google:

- Should the top search results be manually changed when it is discovered they lead to misinformation and hate?
- Should the algorithm be changed to try to avoid these results?
- Should the company do nothing and say that the algorithm decides the results, period?
- Should any decision set a precedent for future decisions, and if so, what policies and guidelines need to be put in place to deal with future cases?
- Are there other ways to respond to this situation?
- How should Google handle attempts to game search via things like Googlebombing?

Questions and policy implications to consider:

- If any changes are made, will lots of others expect similar changes to be made as well?
- Will making changes lead to questions regarding the credibility of search results and the Google algorithm?
- What sorts of policies and processes need to be in place to deal with these kinds of requests?
- Will any changes have other, unintended consequences as well?
- Are search engine optimization techniques nefarious? Can they be? If so, how do you distinguish between good intentions and bad intentions?
- If you block certain techniques, such as Googlebombing, will that stop the practice when used for good purposes as well?

Resolution: Google responded by clearly stating that it had no direct intentions to change its algorithm. However, it did decide to provide more information, by using the advertising space above the top result to encourage people to click through for more information about how the results came about. The company also stated that it would “explore additional ways of addressing” issues like this “in the future.”

Perhaps more interesting, however, was that Google’s users took matters into their own hands, and realized that if Jewwatch was Googlebombing, they could use the same tools to diminish the result. A campaign was quickly organized online, with many people linking the word “jew” to Wikipedia’s page on Judaism, and indeed, this worked to get that result to the top of the rankings.

Over time, Google’s algorithms were adjusted globally to try to diminish the power of Googlebombing for any reason (good or bad). In 2007, the company announced that it believed its algorithm would filter out attempts at Googlebombing. In that discussion, the employees who helped stop the effectiveness of Googlebombing explained why they did so, and how they believed it was better to take a holistic approach (which was more scalable) than responding to individual “bad” results:

People have asked about how we feel about Googlebombs, and we have talked about them in the past. Because these pranks are normally for phrases that are well off the beaten path, they haven't been a very high priority for us. But over time, we've seen more people assume that they are Google's opinion, or that Google has hand-coded the results for these Googlebombed queries. That's not true, and it seemed like it was worth trying to correct that misperception. So a few of us who work here got together and came up with an algorithm that minimizes the impact of many Googlebombs.

The next natural question to ask is "Why doesn't Google just edit these search results by hand?" To answer that, you need to know a little bit about how Google works. When we're faced with a bad search result or a relevance problem, our first instinct is to look for an automatic way to solve the problem instead of trying to fix a particular search by hand. Algorithms are great because they scale well: computers can process lots of data very fast, and robust algorithms often work well in many different languages. That's what we did in this case, and the extra effort to find a good algorithm helps detect Googlebombs in many different languages. We wouldn't claim that this change handles every prank that someone has attempted. But if you are aware of other potential Googlebombs, we are happy to hear feedback in our Google Web Search Help Group.

Filed Under: case study, content moderation, misinformation, search
Companies: google
4 Comments | Read More

That's A Wrap On Techdirt Greenhouse, Privacy Edition

Privacy, from the so-much-to-talk-about dept
Wed, Jul 15th 2020 1:39pm — Karl Bode

The inaugural edition of the Techdirt Tech Policy Greenhouse is in the books, and we'd like to thank all of our contributors and those that engaged in conversation as we tackled one of the thornier issues of the modern tech policy era.

As we noted early on, our goal with the project is to bring some nuance, collaboration, and understanding to a privacy conversation frequently dominated by simplistic partisan bickering, bad faith arguments, and the kind of deep ideological ruts that routinely result in either bad policy, or, in the case of U.S. privacy, no policy at all. If you've not yet had a chance to dig through contributions for this inaugural edition, here's a brief rundown:

- Senator Ron Wyden argued that it's time for Congress to finally pass a meaningful privacy law for the internet era. One with an eye on transparency, end user control, and meaningful penalties for incompetent or malicious corporations.
- Lindsey Barrett discussed the fixation on "big tech" when talking about privacy, and how this has allowed certain actors (predominantly in the adtech and telecom sectors) to tap dance over, around, and under meaningful scrutiny for the same or worse behavior.
- Evan Engstrom discussed whether we can craft a meaningful privacy law in the United States without ushering forth a new breed of privacy troll.
- Gigi Sohn and Jeff Gary explored how telecom industry lawyers in Maine are myopically engaged...

w.techdirt.com Whois

"domain_name": "TECHDIRT.COM", "registrar": "TUCOWS, INC.", "whois_server": "whois.tucows.com", "referral_url": null, "updated_date": [ "2020-01-16 02:20:04", "2020-01-16T02:20:04" ], "creation_date": [ "1998-02-16 05:00:00", "1998-02-16T05:00:00" ], "expiration_date": [ "2021-02-15 05:00:00", "2021-02-15T05:00:00" ], "name_servers": [ "JIM.NS.CLOUDFLARE.COM", "KATE.NS.CLOUDFLARE.COM", "jim.ns.cloudflare.com", "kate.ns.cloudflare.com" ], "status": [ "clientTransferProhibited https://icann.org/epp#clientTransferProhibited", "clientUpdateProhibited https://icann.org/epp#clientUpdateProhibited" ], "emails": [ "domainabuse@tucows.com", "support@domainrocket.com" ], "dnssec": "signedDelegation", "name": "REDACTED FOR PRIVACY", "org": "REDACTED FOR PRIVACY", "address": "REDACTED FOR PRIVACY", "city": "REDACTED FOR PRIVACY", "state": "CA", "zipcode": "REDACTED FOR PRIVACY", "country": "US"