You can install servers on your own hardware. Even low-end machines can be
more than enough. Whether they are depends on the server software and on the
number of clients connecting simultaneously (e.g., for a Web server, how many
visitors at the same time). If the machine is also used as a desktop, the
desktop applications can cause problems: memory leaks, crashes, etc., whereas
stability is a must when running server software.
And, as root_vegetable wrote, you cannot switch the machine off if you want
your server to always be available.
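To see whether your home server really stays available, you can run a periodic check from another machine. Here is a minimal sketch (the hostname is a placeholder; point it at your own server, and run it from cron if you want a log):

```shell
# Minimal availability check for a self-hosted server.
# HOST is a placeholder value; replace it with your server's address.
HOST="127.0.0.1"

# One ping, two-second timeout; any failure counts as "down".
if ping -c 1 -W 2 "$HOST" > /dev/null 2>&1; then
  STATUS="up"
else
  STATUS="down"
fi
echo "$(date -u '+%Y-%m-%dT%H:%M:%SZ') $HOST is $STATUS"
```

A single ICMP echo is only a rough test (the machine may answer pings while the actual service is broken), but it catches the common case of the box being switched off.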
The most ecological solution is generally to share one dedicated server among
many users. Each of them can administer her own virtual machine. Thanks to the
sharing, the load is more constant: the hardware does not waste as much
electricity sitting idle. The hardware in question is usually server-grade
too (no video card consuming energy for nothing).
If professionals take care of the remote hardware, you do not have to
(replacing failed parts, keeping the server available, etc.). You also usually
have the option of immediately getting more resources (CPU, RAM, storage) by
paying more. Pre-configured servers are usually an option too (configuring a
mail server is not easy...). The obvious drawback of this solution is that
you do not control the hardware, only the software (assuming that you are
"root" on your dedicated server and that you only use free software).
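When you pay for more resources, you can verify from the shell that the virtual machine actually received them. A quick sketch using standard Linux tools (the exact numbers obviously depend on your plan):

```shell
# Inspect what a rented (virtual) server actually provides.
CPUS=$(nproc)                                         # CPU cores visible to the VM
MEM_KB=$(awk '/MemTotal/ {print $2}' /proc/meminfo)   # total RAM, in kB
DISK=$(df -h / | awk 'NR==2 {print $2}')              # size of the root filesystem

echo "CPUs: $CPUS, RAM: $((MEM_KB / 1024)) MiB, disk: $DISK"
```

On a shared host, keep in mind that `nproc` and `/proc/meminfo` report what the hypervisor exposes to your VM, not the physical machine underneath.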