I have a Dell Optiplex 960 with a Core 2 Duo and a Radeon HD 3450 that I use to stream movies from my file server over the LAN; the files range from 480p to 1080p, including 720p. This Optiplex just replaced a Dell Dimension with a Pentium 4 and another Radeon GPU. The problem I'm noticing is that with every movie, regardless of quality, graphical artifacts appear at random points during playback, sometimes covering small parts of the screen and sometimes the whole picture until the camera angle changes. I updated the GPU driver, but that didn't fix it.
The reason I switched to the Optiplex is that the Dimension couldn't handle 720p-1080p very well; every so often the movie would stop to buffer, or it would skip. With the Optiplex, movies play without buffering, but these artifacts are killing me, especially in high-action scenes. I tried swapping in other GPUs I had, a Radeon X1350 and a Radeon X1500, but they either showed the same artifacts or couldn't play the movies at all without skipping.
So my question is: what could be causing these artifacts? I don't think my LAN connection is the problem. All the computers in the house have gigabit NICs, and even though the router is only 10/100, transfer speeds over the wire peak at around 100 Mb/s, which is well above what even a 1080p movie needs, so the network shouldn't be the bottleneck (rough sanity check below). Could it be the computer itself? Thanks
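In case it helps, this is the kind of rough back-of-the-envelope check I'm basing that on: estimate a movie's average bitrate from its file size and runtime and compare it to what the link actually moves. The file path and runtime here are just placeholders, not my actual setup.

# Rough sanity check: does the movie's average bitrate come anywhere
# near the ~100 Mb/s the 10/100 router tops out at?
import os

movie_path = r"\\fileserver\movies\example_1080p.mkv"  # placeholder path
runtime_seconds = 2 * 60 * 60                          # placeholder: a 2-hour movie

size_bits = os.path.getsize(movie_path) * 8
avg_bitrate_mbps = size_bits / runtime_seconds / 1_000_000

link_mbps = 100  # what the transfers actually peak at over the 10/100 router

print(f"Average bitrate: {avg_bitrate_mbps:.1f} Mb/s")
print(f"Headroom on the link: {link_mbps - avg_bitrate_mbps:.1f} Mb/s")
# Even a high-bitrate 1080p rip is usually in the 20-40 Mb/s range,
# so the LAN shouldn't be the bottleneck, which points back at the PC/GPU.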