30 comments

  1. Peter Jindra says:

    It didn't recognize those shoes even though a decent percentage of the shoes look like that? Sounds like you just have more work to do in improving the AI, since “biased” humans can easily recognize all of those shoes.

  2. The Whirling Dervish says:

    Google lecturing about bias when its search algorithms and news feeds are wishy-washy lefty biased… so we can imagine the Marxist AI that Google is developing

  3. Mike Lawrence says:

    Filtering results is EVIL.
    A hand-crafted filter doesn’t reflect truth, it reflects the hand that crafted it.

  4. Amit Sarda says:

    Never thought about machine learning and human bias. Always thought it would not affect the results. But we are designed to see the world through our own eyes and experiences.
    Why would our code be any different?

  5. Mike Lawrence says:

    Here is what happens when Google “tunes”, aka filters, search results:
    Google “star trek white genocide” and count how many pages you have to click through to find an original article claiming there is “white genocide” in the new “Star Trek” behind-the-CBS-paywall series. The search results are pages and pages of one-sided SJW virtue-signal propaganda. Shouldn’t a fair result include at least one result presenting the source of the controversy? Before you call me a racist for pointing this out, you should know my wife is dark-skinned and I’ve volunteered for 7 years helping the inner-city poor in Atlanta, GA.

    “Unbiased” learning algorithms tuned to protect us from any facts or opinions that are not politically correct blind us from reality and prevent us from forming an informed opinion.

  6. The Legend34 says:

    This is the most backwards thinking. Google, what is happening to you lately? Do you even think anymore, or do you let the machine learning come up with this stuff? You obviously have to program it biasedly to not be biased, lmao

  7. Allan Bartlett says:

    Your counter-methods also have bias. Also, has anyone tried Carnap’s structuralism? Definitions in terms of relations?

  8. Vandee Digitalis says:

    Yes, human biases do become part of the technology we create, especially for those of us who rely on machine learning. But what about those who created the IBM code 0011001010 computer programs? There are definitely biases from those programmers, and who made the big decisions on how those programs' outcomes affect certain people? Technology biases: I know they exist, as computers are trying to “balance” out discrimination as they, the computers, see fit? VD

  9. Vandee Digitalis says:

    1:51 / 2:33 selection bias. Why are computers sending me fake, computerized everything: emails, page views, spam, and often real computerized telephone calls? Because the programs are faulty and disregard anything that is truthful, per the agenda of the tech giants. Equality????? VD

  10. Vandee Digitalis says:

    There is a magic bullet, and it is weeding out those who figured out the U.S. Gov, technology, and the truth of surveillance. It does not exist, surveillance, except for a very few: those who figured it out and those who are not with the Gov and technology but understand it. Bullets kill people, even in the Matrix mind. VD

  11. Vandee Digitalis says:

    The new programs are to benefit the computer nerds, so now everything is reversed for technology; even those who use and rely on technology lose. VD

  12. Louis vd Walt says:

    The only problem with that is that you are always forced to pick one answer where several might be applicable… Why can’t I draw two shoes? Why is the answer either this or that, never ‘select all that apply’? Do you build bias into your software on purpose?

  13. Gabriel Parrondo says:

    “There is no magic bullet”? “There is no magic solution” or “There is no silver bullet”. Pick one! :p

  14. MintSodaPop says:

    Interesting. Apparently anything that encourages inclusivity and impartiality is “leftist propaganda” and a “political statement.” lol

  15. Dat Boi says:

    All of these are dataset biases.
    And in the physics case, if you’re trying to classify physicists from images, then yes, it should have a bias towards males if males outnumber females.
    There is nothing wrong with biases if the biases are based in reality.
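    The dataset-bias point above can be sketched in code. This is a minimal illustration, not from the original video: the `MajorityClassifier` class is a hypothetical baseline I'm introducing to show how a skewed training set alone determines what a naive model predicts.

    ```python
    from collections import Counter

    # Toy training labels drawn from a skewed distribution, mimicking an
    # image dataset where one group is heavily over-represented.
    train_labels = ["male"] * 90 + ["female"] * 10

    class MajorityClassifier:
        """Baseline that always predicts the most common training label."""

        def fit(self, labels):
            # The single most frequent label becomes the fixed prediction.
            self.prediction = Counter(labels).most_common(1)[0][0]
            return self

        def predict(self):
            return self.prediction

    clf = MajorityClassifier().fit(train_labels)
    print(clf.predict())  # prints "male"
    ```

    Even with no malicious intent in the code, the 90/10 skew in the data becomes the model's output: the bias lives in the dataset, exactly as the comment describes.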

  16. SammyFloydWals says:

    So we should get rid of our human influences by influencing it? It makes no sense to filter the search results.

Comments are closed.