Joined: Jan 20, 2008 · Messages: 1,724 · Points: 0
Today I was upgrading a computer for someone, just swapping the mobo, CPU, and RAM, and they didn't feel like shelling out an extra $50 for a new power supply to go along with it. Their original PSU was a 350W unit, and I wasn't sure it'd be enough.

So I built it, fired it up, and everything was going swell until I plugged in a second hard drive to clone their data in case the repair install went south. With two hard drives connected, it would only POST maybe one out of every two boot attempts. I suspected this was caused by lack of power, so I figured I'd measure exactly how many watts it was drawing. I cut open a spare power cord, cut the hot wire, and attached my multimeter in series across the break, set to AC amperage, but with the meter inline the machine wouldn't power up at all. My multimeter is rated up to 10A, and I've used the amperage function before just fine, so I'm pretty sure the fuse isn't blown, yet for some reason the computer just wouldn't start.
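For a ballpark answer without any wiring, you can just sum up typical component wattages. Here's a minimal sketch; every number below is a rough assumption for illustration, not a measurement of this particular build:

```python
# Rough estimate of system power draw by summing typical component
# wattages. All values are ballpark assumptions, not measurements.
components = {
    "cpu": 95,        # assumed mid-range CPU TDP
    "motherboard": 30,
    "ram_stick": 5,   # per stick
    "hdd": 10,        # per drive; spin-up surge can be 2-3x higher
    "optical": 15,
    "fans_misc": 15,
}

total_watts = (
    components["cpu"]
    + components["motherboard"]
    + 2 * components["ram_stick"]  # assuming two sticks
    + 2 * components["hdd"]        # assuming two drives
    + components["optical"]
    + components["fans_misc"]
)
print(f"Estimated draw: {total_watts} W")

# Sanity check for the multimeter: current drawn at the wall.
mains_voltage = 120  # assuming North American mains
amps = total_watts / mains_voltage
print(f"~{amps:.1f} A at {mains_voltage} V AC (well under a 10 A meter)")
```

Note the spin-up surge comment: two drives spinning up together can briefly exceed the steady-state estimate by a wide margin, which is exactly the kind of load that makes a marginal 350W PSU fail to POST intermittently.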
SO... I was wondering if there's an easier way of figuring out the total power draw of a computer, say something software-based? Either that, or can anyone point out what I was doing wrong when trying to measure the amperage?