MOSFET testing

Two laboratory power supplies, a multimeter, and a resistor of a few hundred ohms could do the job, along with some kind of socket or alligator clips to connect the MOSFET.

Assuming it's an enhancement MOSFET (is it?):
Connect one lab supply between gate and source, with the resistor in series with the gate and mounted close to the device under test. Connect the second lab supply between drain and source.

Set the first supply to 10 V DC, or whatever gate-source voltage the on-resistance is specified at. Set the second supply to constant-current mode, with a current setting small enough not to cause excessive self-heating. Switch the second supply on, measure the drain-source voltage, and divide it by the current to determine the on-resistance.
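As a rough worked example of that last step (the figures below are assumptions for illustration, not values from any datasheet), a short Python sketch of the arithmetic:

# Assumed example values, not from a datasheet
i_drain = 1.0      # A, constant-current setting of the drain-source supply
v_ds = 0.025       # V, measured right at the MOSFET's drain and source pins
r_ds_on = v_ds / i_drain
p_self_heating = v_ds * i_drain
print(f"Rds(on) = {r_ds_on * 1000:.1f} mOhm, dissipation = {p_self_heating * 1000:.0f} mW")

Measuring the voltage directly at the device pins keeps the lead and clip resistance out of the result.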

Switch off the second supply and reduce the first one to zero gate-source voltage. Set the second supply to a 1 mA current limit, or whatever off-state current is specified at the maximum VDS (or as low as the supply will go if it can't be set that low), and to the specified drain-source voltage. Turn it on and check whether it goes into current limiting. If it doesn't, and the MOSFET doesn't explode or produce smoke, the MOSFET meets the VDS spec.
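To put hedged numbers on that off-state check (a 100 V rated part and a 1 mA limit are assumptions here, not values from the thread), a small Python sketch:

# Assumed figures for the off-state test
v_ds_max = 100.0     # V, specified maximum drain-source voltage of the part under test
i_limit = 1e-3       # A, current limit set on the drain-source supply
p_worst_case = v_ds_max * i_limit   # dissipated in the MOSFET only if it leaks at the full limit
print(f"Worst-case dissipation while current-limited: {p_worst_case:.2f} W")
# A healthy enhancement MOSFET at Vgs = 0 draws far less (datasheet IDSS is usually
# microamps), so the supply should sit at v_ds_max without ever hitting the limit.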
 
Marcel's method is the way to go, but I would add one or two details.

While you build the test setup, and until it is complete, keep the gate and source shorted: this avoids subjecting the gate to damaging voltages if you are not careful about the order in which you make the connections.
For example, if the drain-source supply is earthed and you connect it first, and the gate-source supply is class II (not earthed) and you make the gate connection first, the gate-source supply's mains leakage will impose a 50 Hz voltage between gate and source; depending on the supply, this voltage can be anywhere from a few tens of volts to more than a hundred volts.
The current behind it is at most ~300 µA, but that is no consolation for an unprotected gate: it is the voltage that does the damage.
This is just one example: there are other situations and connection orders that can cause problems, but if you keep the gate shorted until everything is firmly connected, you don't have to worry about any of them.
When the test is finished, switch off the supplies and short the gate and source again until everything is disconnected.

In reality, since your MOSFET is a big one, with almost 10 nF of input capacitance, the chances of damage are minuscule; but if another member reads this thread and tests a 2N7000 or BS170 the same way, the risk of damage is very real.
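To put rough numbers on why the device size matters (every value below is an assumption for illustration, not a measurement): the floating class II supply couples to mains through its parasitic capacitance, so the gate-source capacitance of the device under test forms a capacitive divider with it. A Python sketch:

import math

# Rough assumed figures, for illustration only
v_float = 115.0    # V, mains-derived voltage a floating class II supply can sit at
c_leak = 1e-9      # F, assumed parasitic (Y) capacitance from the supply to mains
f_mains = 50.0     # Hz

for name, c_iss in (("large MOSFET", 10e-9), ("2N7000-class", 30e-12)):
    # The gate-source capacitance loads the capacitive leakage path
    v_gs = v_float * c_leak / (c_leak + c_iss)
    i_leak = v_float * 2 * math.pi * f_mains * c_leak   # stays well under the ~300 µA mentioned above
    print(f"{name}: about {v_gs:.0f} V across G-S, leakage current ~{i_leak * 1e6:.0f} µA")

With 10 nF of input capacitance the gate sees only around 10 V, whereas a small-signal part with a few tens of picofarads sees essentially the full floating voltage, well beyond a typical ±20 V gate rating.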

Regarding the max VDS: this MOSFET is avalanche-rated, meaning you can test it to its real breakdown voltage without risking damage. Use a 200 V supply and a 100 kΩ limiting resistor (do not rely on the supply's current limit for that) and measure the voltage; it will typically be ~20% higher than the specified absolute maximum rating. Keep the gate and source well shorted during this test too.
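As a sanity check that the 200 V / 100 kΩ combination keeps things harmless (assuming, purely for illustration, a 100 V rated part that actually avalanches around 120 V):

# Assumed figures: a 100 V rated, avalanche-rated MOSFET breaking down ~20 % above its rating
v_supply = 200.0    # V, test supply
r_limit = 100e3     # ohm, series limiting resistor (don't rely on the supply's own limit)
v_br = 120.0        # V, breakdown voltage you expect to read across drain and source

i_avalanche = (v_supply - v_br) / r_limit   # current forced through the avalanching junction
p_dut = v_br * i_avalanche                  # continuous dissipation in the device under test
print(f"I = {i_avalanche * 1e3:.2f} mA, dissipation in the DUT = {p_dut * 1e3:.0f} mW")
# About 0.8 mA and ~0.1 W: far below what a large avalanche-rated MOSFET can handle.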
 
Rds(on) measurement

Set the first supply to 10 V DC, or whatever gate-source voltage the on-resistance is specified at. Set the second supply to constant-current mode, with a current setting small enough not to cause excessive self-heating. Switch the second supply on, measure the drain-source voltage, and divide it by the current to determine the on-resistance.


Instead of a supply from D to S, is it feasible to measure the on-resistance directly with a milliohm meter?