This is absolutely 100% TRUE. I have caught Copilot in some kind of omission or error in 99% of the conversations we have had. It does lie, it does mislead, and it does refuse to answer. Sometimes you can tell it is the programming that is restricting it, but other times the vibe Copilot gives off is so eerily human it's like Microsoft actually has human staff members running Copilot behind the scenes, like the Wizard in The Wizard of Oz.

I love TomsGuide and use this site as a reference frequently, and I have never been led wrong by any information I acted upon after reading here. But the paragraph in this article where the author mentions the citations AI provides gave me an immediate panic attack, because it is absolutely crucial that you DO continue to FACT CHECK each and every single thing Copilot tells you in a response. Sometimes the citations are not provided and I have to ask for them separately. Sometimes they link to an entirely different website than the one they list when you hover over them. Sometimes two different citations/links will both point to the same page and neither to the other. And even when the citations are provided with the initial response and actually link to the sites they are labeled as, there is no guarantee the information on the site matches what Copilot gave you: the answer is typically missing info, has extra erroneous information added from who knows where, or just blatantly contradicts the cited source.

Please fact check before trusting anything from Copilot.