How to set up 2, 3 or more monitors
If you’re used to a multi-screen setup, then the information here on USB graphics adapters and docks (e.g. from Plugable https://plugable.com/collections/usb-type-a-graphics-adapters & https://plugable.com/collections/docking-stations/thunderbolt-to-host) is what could be new to you. And if you own an Android phone, the DisplayLink Presenter app along with a DisplayLink USB graphics adapter or dock will let you output your Android screen (apart from DRM-protected streaming content) to a large monitor - and via USB ports on the monitor or dock you can have a keyboard & mouse connected too. Otherwise this article assumes an audience of people who have only used a single screen when left to their own devices.
The Too Long; Didn’t Read - especially for those with an existing screen (e.g. an HDMI-equipped TV):
- identify your computer’s display connections and the screen’s connections (see the specifications in their manuals),
- if the screen connections and the computer connections do not match, an adapter can usually sort it out (including a USB to HDMI/DVI adapter if the computer has only VGA output and the screen has no VGA input) - and DisplayLink USB products allow you, via an app, to at least mirror your screen to a larger monitor,
- finally, make sure you can have the top of the screen at eye level to avoid neck strain (https://www.hse.gov.uk/msd/dse/good-posture.htm) - consider a VESA mounting solution if need be (e.g. a height-adjustable VESA stand or a VESA arm).
This “Video: Connecting a monitor” for Windows 10 here might also be a useful 5-minute summary.
Getting the most out of a 2nd monitor - window management:
Using the screens to their full potential involves arranging windows - i.e. window management. So here is a summary of the ways of doing that - whether you already have a multi-screen setup or create one using either the TL;DR version above or the Long Read below. (For the curious, a short script at the end of this section shows what window snapping actually does.)
Windows:
dragging a window by its title bar and dropping it: at the top of the screen will maximise it,
at the edge of the screen will snap it to half the screen width,
at a corner will make the window take up a quarter of the screen.
Windows Key + Left/Right arrow key will snap the window left or right
Windows Key + Shift + Left/Right arrow key will send the window to the left or right monitor in a multi-monitor setup
Windows Key + Up/Down arrow key will make the current window cycle through minimised, windowed and maximised (or vice versa).
Linux:
It can vary with your window manager - but current GNOME versions (e.g. as in Fedora Linux) use the same keyboard shortcuts as Windows, and drag and drop works except for snapping to a quarter via dragging to a corner.
Mac:
macOS is not good for inbuilt shortcuts for arranging windows - see the official docs. To get the kind of window management built into Windows or Linux one has to use third-party solutions - Magnet is a well-known & full-featured one, I think.
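For the curious, the snapping shortcuts above are just moving and resizing windows to set positions. Here is a toy sketch of the Win+Left behaviour on Windows, using the Win32 API via Python’s ctypes (run from a terminal it will snap the terminal window itself, since that is the window in focus):

```python
# Toy illustration (Windows only): "snap left" is just a move-and-resize.
# Uses the built-in Win32 API via ctypes - no third-party packages needed.
import ctypes

user32 = ctypes.windll.user32
user32.SetProcessDPIAware()  # so GetSystemMetrics reports real pixels

SM_CXSCREEN, SM_CYSCREEN = 0, 1
screen_w = user32.GetSystemMetrics(SM_CXSCREEN)
screen_h = user32.GetSystemMetrics(SM_CYSCREEN)

hwnd = user32.GetForegroundWindow()  # the window currently in focus
# Snap it to the left half of the primary screen, like Windows Key + Left.
user32.MoveWindow(hwnd, 0, 0, screen_w // 2, screen_h, True)
```

Third-party tools like Magnet are essentially doing this, with more polish, via each operating system’s windowing APIs.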
The Long Read:
Why use a computer with multiple screens?
Mostly it’s for not having to switch windows - writing a document while having a web browser open to one side for research, being on a video call in one window while having a document open in another, checking two spreadsheets side by side, etc. Apart from IT people, the people I associate most with multi-screen setups are financial traders, with info on multiple markets or stocks open at once.
Step 1: Identify the display connections on your computer - if you don’t already know
Identify your computer model, and the manual should usually list the available connections in the specifications section. This photo on Wikipedia shows DVI, VGA, and HDMI connections - all reasonably common on monitors that are not the latest & greatest. Likewise, if you have an existing screen and are not sure what the connections are, the manual’s specifications section will name them (often as “Inputs”).
Step 2: Assess a suitable screen for your environment
Next is a whole load of considerations about using a screen: ergonomics, mounting, resolution etc. Even if you have a screen lined up to use, it may be useful to skim through this - for reasons as important as avoiding neck strain and as trivial as understanding what more advanced screens might offer.
Ergonomics
The first thing to remember is ergonomics - this page from the UK Health and Safety Executive shows what is considered good posture when using displays - in particular: having the top of the screens at your eye level when seated. This means that while one might think a large flatscreen TV (e.g. 32 inches or more) would make a fantastic screen, it does not if placed close to you on a table in front of you, as the top is unlikely to be at eye level!
How to get screens to eye level
To have the screens at eye level, the low-tech solution can be placing objects that provide a stable base under your screen to raise it (reams of printer paper have served in the past) - while solutions designed for this include:
- monitors with height-adjustable stands,
- laptop stands (when your laptop screen will be used), and
- VESA mounting solutions.
Using a VESA mount
VESA® mount is an industry standard for the placement of mounting holes - both on the backs of TVs and computer monitors and on the stands they affix to (see the Wikipedia page on VESA mounts). The most common for desktop monitors is a square of holes 100mm apart - VESA 100 colloquially. If using a larger screen such as a flatscreen TV then a VESA mounting with holes 200mm apart may be relevant. Not all monitors have holes for VESA mounts - those without them generally A) have a stand coming out of the bottom of the monitor and B) lack a square of screw holes on the back (check the monitor, its manual for mentions of VESA, or photos online if considering a purchase).
Monitors that have stands affixed to the middle of the rear of the monitor will often have VESA mount holes that are accessible once the included stand is removed - even if the stand is removed via a quick-release mechanism rather than screws. Why consider a VESA mount if stands are included?
Monitor mounting solutions using VESA holes come in a great variety. Some are adjustable arms that clamp onto the edge of a desk and some are wall-mounting arms - these might be useful where desk space is limited. Others are designed for dual-monitor or even quad-monitor setups - so a single stand on the desk can save space compared to using the built-in stands. You can get the screen to eye level - what else matters in choosing a screen?
You may have specialist professional requirements - those working in photography or with print media may need to colour-calibrate their monitors - that is beyond the scope of this article. Likewise, medical imaging has specialist requirements. This is a guide for general users though.
Resolution:
Modern screens - like digital images - are a grid of individual pixels (individual dots) whose colour can be changed. Physical size for screens is specified as the diagonal measurement of the screen in inches, and digital size as the width and height in pixels.
Resolutions to be memorised for reference:
- Full HD (often just called HD) is 1920 pixels wide by 1080 pixels high - this is the standard for Blu-ray discs - and most new monitors can do at least this. Seated at a desk, I find a screen with a 22 to 27 inch diagonal at this resolution is a good size for a multi-monitor setup - not too small, nor too large to take in the whole screen at a glance (a worked pixel-density example follows this list). A new Full HD monitor would cost £70 or more - differentiating features will be the connections, whether a USB hub is included, whether it has VESA mount holes, and whether the stand allows rotation or height adjustment.
- 4K UHD is 3840 pixels wide by 2160 pixels high - four times the pixels of Full HD - and is the common standard for 4K screens and TVs.
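To make the “not too small, nor too large” point concrete, pixel density (pixels per inch, PPI) ties the resolution to the physical size. A minimal worked example in Python (the monitor sizes are illustrative):

```python
# Pixel density (PPI) = diagonal resolution in pixels / diagonal in inches.
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch for a screen of the given resolution and diagonal."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_inches

print(f"24in Full HD: {ppi(1920, 1080, 24):.0f} PPI")  # ~92 PPI
print(f"27in 4K UHD:  {ppi(3840, 2160, 27):.0f} PPI")  # ~163 PPI

# And 4K UHD really is four times Full HD by pixel count:
print(3840 * 2160 / (1920 * 1080))  # 4.0
```

A higher PPI means finer detail but smaller on-screen text at default scaling - one reason operating systems offer display scaling settings for 4K monitors.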
Having screens that match each other in resolution is convenient - because when the screens are aligned (as at the 3 minute mark of “Video: Connecting a monitor” for Windows 10), moving the mouse from one screen to another encounters no obstructions.
What if there is a resolution mismatch? If one screen has a much higher resolution than another - and you use that high resolution - then moving the mouse to the lower-resolution screen may only work through a specific part of the shared edge. It is like moving through a room split into areas with different floor or ceiling heights: to avoid bumping your head or being tripped up, you cross at the point where the levels line up.
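A toy sketch of the same idea in Python, assuming a 4K screen and a Full HD screen side by side with their tops aligned in the virtual desktop (the sizes are illustrative):

```python
# Two top-aligned screens side by side in one virtual desktop: the mouse
# can only cross the shared edge where the two y-ranges overlap.
left_screen = {"width": 3840, "height": 2160}   # 4K on the left
right_screen = {"width": 1920, "height": 1080}  # Full HD on the right

crossable = min(left_screen["height"], right_screen["height"])
blocked = left_screen["height"] - crossable

print(f"Mouse can cross between y=0 and y={crossable - 1}")
print(f"The bottom {blocked} pixels of the shared edge act as a wall")
```

In practice the operating system lets you drag the screens’ rectangles up and down relative to each other, so you can at least choose where the crossable band sits.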
The other reason to pay attention to resolution is the small possibility that a modern monitor, e.g. a 4K one, has a higher resolution than your computer’s graphics can handle. That is more likely with processors from before 2015 or Celeron/Pentium-branded Intel processors (rather than Intel Core processors) - and these older/cheaper processors are less likely in a professional environment. Your computer manual, graphics card manual or the specs for the processor graphics should tell you the maximum possible on your system. In terms of resources to check: for Intel processors, the specs on https://ark.intel.com show whether the integrated graphics can handle 4K and the theoretical number of screens supported (though processors with integrated graphics that can do three screens may sit on motherboards with only two connections); for Macs, everymac.com has a 2nd Display Support category under the Tech Specs tab for each model; for AMD CPUs and integrated graphics, the documentation is much harder to navigate than Intel’s (I think it might come down to seeing which DisplayPort standard the integrated graphics support, then seeing what that standard specifies - i.e. consulting multiple technical documents)!
Rotatable?
In the monitor settings on Windows, Linux and Mac it is normal to be able to rotate the image sent to the screen - one just needs a stand that permits rotation (or to mount the monitor in a permanently rotated position) to have a screen that is taller than it is wide. This is mostly a benefit for long documents or web pages.
Ultrawide?
If you have several hundred pounds to spend then an ultrawide screen is an option (and might remove the need for a second monitor). However, I would suggest that ultrawide monitors that are only Full HD class (1080 pixels high) constitute poor value for money compared to 4K ones. Given that a new Full HD screen costs less than £100, having two for £200 or much less seems better value than one pricey HD monitor stretched larger (two monitors also increase the window management options compared to a single monitor, since snapping windows to a half or a quarter applies per screen).
Cables:
VGA, HDMI, DVI, and DisplayPort cables can all be bought in long lengths - which may assist if you want to connect to a TV some distance away, or e.g. have a desktop computer on the floor but a multi-monitor setup at eye level in a standing-desk arrangement. In terms of common cables: HDMI, as long as the cable supports HDMI version 1.4, can be used for a 4K monitor (at 30Hz - HDMI 2.0 is needed for 4K at 60Hz) - so it is the kind of cable one might easily find available, and in long lengths, that is usable for both work and entertainment purposes when connecting displays.
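As a rough sanity check on why HDMI 1.4 copes with 4K at 30Hz, the raw pixel data rate sits well under HDMI 1.4’s roughly 10 Gbit/s ceiling. A back-of-envelope calculation (it ignores blanking intervals and encoding overhead, so the real figure on the wire is higher):

```python
# Back-of-envelope: raw pixel data rate for 4K at 30Hz with 24-bit colour.
# Ignores blanking intervals and encoding overhead, so the on-the-wire
# figure is higher - but still within HDMI 1.4's ~10 Gbit/s capacity.
width, height = 3840, 2160   # 4K UHD
frames_per_second = 30
bits_per_pixel = 24          # 8 bits each for red, green and blue

bits_per_second = width * height * frames_per_second * bits_per_pixel
print(f"{bits_per_second / 1e9:.2f} Gbit/s")  # ~5.97 Gbit/s
```

Double the frame rate to 60Hz and the requirement roughly doubles too - which is why 4K at 60Hz needs HDMI 2.0.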
HDR?
Along with 4K, the monitor may support HDR - High Dynamic Range. This allows a wider variation in colours to be represented (a sunset is a good example of where one sees a great range of brightness and shadow and wants to see the detail in both) - and if the images displayed are appropriately captured and processed this may be a benefit. But it is specialist compared to everyday usage for general office work (for movies HDR may help).
Gloss or matte?
Computer screens are often matte - but modern TVs in particular often have somewhat glossy screens, which can make for distracting reflections. A photo of the screen switched off, showing how clear the reflections in it are, often gives enough of an idea of how glossy it is.
Power consumption:
At the level of individual monitors (as against a fleet of monitors) the differences in power consumption seem small compared to the environmental impact of manufacturing a new monitor. But if you have a couple of choices then check the specs for power consumption. The one exception is very old (a dozen years or more) and very large screens. Very old flatscreen monitors and TVs used fluorescent backlights before the advent of LED backlights - which could mean a 32 inch HD TV used 160 watts where a modern 4K TV can do 40 watts. Large old computer screens with fluorescent backlights and built-in USB hubs could use 40 to 60 watts - where a 31 inch Full HD screen can now use less than 30 watts. However, using a very old monitor for occasional use makes more environmental sense than getting a brand new monitor if new is not required.
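To put those wattage figures in money terms, a quick calculation - the hours of use and electricity price below are illustrative assumptions, not measurements:

```python
# Illustrative annual running cost: 160W fluorescent-backlit TV vs 40W modern TV.
# Hours of use and price per kWh are assumptions - substitute your own.
hours_per_day = 8
days_per_year = 250      # roughly a working year
price_per_kwh = 0.30     # GBP per kilowatt-hour, illustrative

def annual_cost(watts: float) -> float:
    """Yearly electricity cost in GBP for a device drawing `watts` in use."""
    kwh_per_year = watts / 1000 * hours_per_day * days_per_year
    return kwh_per_year * price_per_kwh

print(f"Old 160W screen:   £{annual_cost(160):.2f}/year")  # £96.00
print(f"Modern 40W screen: £{annual_cost(40):.2f}/year")   # £24.00
```

At these assumed figures the difference is tens of pounds a year - noticeable, but usually smaller than the cost (financial and environmental) of a new monitor.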
HDCP - or do you watch consumer entertainment for/at work?
High-bandwidth Digital Content Protection (HDCP) is a standard you need to play copy-protected content on your screen - which includes things like Blu-rays or high-definition streamed video from a movie studio (but not ordinary HD YouTube streams where the content is not a major movie/TV series etc.). HD TVs will definitely have it, as will any new monitor with a digital input such as DVI, DisplayPort or HDMI. Only analogue monitors (with only a VGA input) or very old monitors (e.g. pre-2008) will not have it - by 2008 the then-current DisplayPort standard included HDCP, for instance. If you are e.g. a culture critic then you will need HDCP, but almost all monitors will have it anyway.
You know your graphics connections & have selected a screen - now what?
If you don’t need any adapters (e.g. your computer and the screen both have HDMI connections) then connect the screen to the computer and power on. The procedure then depends on whether you are on Windows, Mac or Linux.
On Windows:
the “Video: Connecting a monitor” for Windows 10 here is a decent summary. The main settings to adjust are in the Display Settings dialogue shown at the 3 minute mark of the video. To get the benefit of multiple monitors you want to extend, rather than duplicate, the desktop in those settings. By default each monitor should run at its maximum resolution - so in Display Settings it is a matter of ensuring the diagrammatic representation of the monitors matches the physical layout. A mismatch means, for instance, that the monitor on the left physically is on the right diagrammatically - so you have to move your mouse right to go to the left-hand screen, a confusing setup. That is why you can rearrange the screens’ logical positions, using drag and drop on the diagram, to match the physical positions of the screens - something the video demonstrates.
On Mac:
it is in System Settings > Displays where one makes adjustments.
https://support.apple.com/en-gb/guide/mac-help/-mh40768/mac - as in the Windows video referenced above, one can drag around rectangles representing the connected displays to rearrange them.
On Linux:
it will vary by system - but the concepts are similar: a settings dialogue where one can choose to mirror or extend displays, rotate a display, set the resolution of a display, and arrange the displays relative to each other.
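As an illustration of what such a settings dialogue does under the hood on X11-based Linux systems, here is a minimal sketch driving the xrandr tool from Python - note the output names eDP-1 and HDMI-1 are assumptions, so run `xrandr --query` first to see what yours are called:

```python
# Minimal sketch for X11 Linux systems with the xrandr tool installed.
# Output names (eDP-1 for a laptop panel, HDMI-1 for an external monitor)
# are assumptions - check the names `xrandr --query` prints on your system.
import subprocess

# List connected outputs and the modes (resolutions) they support.
subprocess.run(["xrandr", "--query"], check=True)

# Extend the desktop: place the external monitor to the right of the
# laptop panel, each running at its preferred resolution.
subprocess.run(
    ["xrandr", "--output", "HDMI-1", "--auto", "--right-of", "eDP-1"],
    check=True,
)
```

Wayland-based desktops use different mechanisms, so there the graphical settings dialogue is the reliable route.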
Congratulations! If it is all set up now then check you know the keyboard shortcuts in the window management section above, and enjoy.
What if I need an adaptor - or even more displays?
If your graphics connections don’t match the screen connections, you have adaptor options - and a USB graphics adaptor can overcome a limit on how many displays your graphics card or computer motherboard can drive (though if 3 screens are not enough & you have a desktop, you can get graphics cards with at least 4 ports).
However - apart from USB-C connections to ports with DisplayPort or Thunderbolt built in (meaning the computer outputs a display signal directly to a compatible display, as designed by the manufacturer) - it is best to avoid USB graphics adaptors where you can: they have less responsive graphics (i.e. for handling video, as against fairly static content like office documents) and put a load on the processor in a way that I believe adaptors between graphics protocols (like DisplayPort etc.) and docking stations do not.
In terms of graphics protocol adaptors you can get:
- DisplayPort to: DVI, HDMI, VGA, miniDisplayPort
- miniDisplayPort to: DVI, HDMI, VGA, DisplayPort
- DVI to: VGA, HDMI
- USB-C (or Thunderbolt 3 - which uses the USB-C connector) to: VGA, HDMI, DisplayPort
What you will not find (cheaply at least) is VGA to DVI or HDMI. That is because VGA is an older analogue signal standard and converting from analogue to digital (DVI and HDMI are digital) is unnecessarily complex when the computer graphics are digital to start with.
If you only have a VGA connection on your computer but want to connect to a digital screen - or you have a Mac laptop that can only drive one external display - then a USB graphics adapter or docking station is the best option. An established brand is Plugable - the USB 2.0 options here would work with a laptop too old to have more than a VGA output, and this, for example, would work on a modern Mac restricted to one external display via Thunderbolt. USB graphics adapters commonly use DisplayLink chips - for which drivers can be found at https://www.synaptics.com/products/displaylink-graphics/downloads (in case the manufacturer is no longer offering current drivers).
A docking station may use a specialised connector on the computer (historically often on the underside of the laptop, leaving the connectors at the side free) or standardised ones like USB-A, USB-C or Thunderbolt (e.g. these). The advantage of a docking station over a simple adaptor is that if you are limited in the number of ports on your computer (a laptop, or a Mac laptop especially), a docking station can expand both the number of graphics connections and the other ports available (with things like Ethernet for wired internet and USB connections). For laptops or tablets it can also be more convenient to connect one docking station cable when you return to your desk from travel, rather than reconnecting multiple displays, keyboard, mouse, wired Ethernet etc.
The one place where USB graphics adapters are brilliant & the best game in town is not on desktops or laptops - but on Android phones/tablets. There is a DisplayLink Presenter app (currently works with Android 5 and above) - with it, one can use an Android phone with the display mirrored to a much larger external monitor, given a DisplayLink USB graphics adapter or dock. This will not mirror some content, like streaming video, due to DRM restrictions. Docks typically have their own power supply, but USB graphics adaptors need power - without it they may fail to work or will run down the mobile device’s battery - and there are two ways to deal with that:
- A USB Y cable (with an auxiliary USB connector to plug into a USB power supply) can deal with it easily, if the USB cable of the adapter is removable and replaceable.
- If you connect the phone to a USB hub (one might be built into your monitor - look for USB ports), then the hub’s power supply will power the USB graphics adaptor rather than the mobile device’s battery (if the hub is in the monitor then no extra power supply is needed). Going via a USB hub (in the monitor or not) or a dock also offers the advantage of extra ports to plug in a keyboard/mouse, or the dongle for wireless versions (if you don’t use Bluetooth for these).
Finally: cybersecurity and secondhand monitors
Things are cheaper secondhand - are there cybersecurity implications of buying secondhand monitors?
The first thing to say is that if you work for a large organisation, or with information classed as sensitive data under data protection legislation, you would probably go with new equipment from trusted suppliers anyway. But if you are a cash-strapped small business or charity, the takeaway is: check with your insurers/own experts, and don’t use secondhand smart TVs, as they are an obvious risk. Beyond that - without accepting liability - I would not rate a used monitor as a likely security risk if you only connect via DisplayPort, DVI, or VGA (if your work involves external presentations then connecting to someone else’s projector or conference-room flatscreen already involves this kind of risk).
While noting that for most professional settings one would simply go for new items from a trusted supplier, I do think there are meaningful distinctions to be drawn when considering used displays. Implicit in the following is the assumption that you buy from a large, established reseller of business IT equipment and that nobody is trying to target your business specifically by e.g. opening and modifying the hardware. That puts the focus on what can become harmful in the ordinary course of things - i.e. the chances of a display becoming harmful through connection to a hacked computer - rather than on an attacker who knows what you are buying, intercepts it, and modifies it to target you. So the threat model does not include the hardware reseller or state-level agencies; the threat is random malware found generally. Nor is the threat model about whether electronic signals intelligence could see what is on the monitor (i.e. https://en.wikipedia.org/wiki/Tempest_(codename) ) - defending against that needs secure design & installation, not just a new product.
The first distinction is to separate smart TVs from all other kinds of displays. A smart TV typically has an embedded computer (often running Android or a custom OS based on Linux) and often wireless connections - and being produced for the consumer market it may not get much in the way of updates. So do you want to allow old, out-of-date, potentially hacked computers through the door? No. That does not mean using one you bought new & never connected to the internet could never be OK - but it does mean secondhand smart TVs with unknown histories should be excluded as an option.
Secondly: Thunderbolt and HDMI (which both allow Ethernet) are more complex protocols - so the possibility of communicating over them and hacking the devices on each end is greater (e.g. Thunderbolt attacks are known to be possible with a custom-made malicious peripheral). There is a DEFCON presentation here on hacking HDMI-CEC (Consumer Electronics Control - a protocol for communicating with and controlling connected media devices).
Thirdly: USB is vulnerable - not only in terms of the usual “infected files on a USB stick”, but if the USB firmware of the device itself is rewritten, the lack of authentication in the standard can allow attacks like BadUSB (https://en.wikipedia.org/wiki/BadUSB). There are ways to update USB device firmware (e.g. see these Microsoft docs), so unfortunately I can’t exclude sufficiently sophisticated malware using a USB device as a vector - even if most folks would not class it as an obvious threat the way a USB stick is.
Fourthly: searching for DisplayPort in the NIST National Vulnerability Database turns up just 3 results at the time of writing. For VGA, the equivalent results often concern vulnerabilities in the code of virtual graphics cards emulating VGA connections in virtual machines, and for DVI the mentions concern device-independent files (the TeX output format) rather than the video connection standard.
In conclusion: I don’t have depth in this area & am not accepting any liability - for better advice consult your own experts or the insurance company that would cover you in the event of cybersecurity-related business interruption. But avoiding used smart TVs, HDMI devices, USB graphics adapters and any USB hubs built into monitors, in favour of connecting to monitors via DVI or DisplayPort connections (or even VGA), would likely reduce the attack surface and the number of likely attacks for used monitors acquired via a reputable reseller. More succinctly: having your network hacked because you added an out-of-date secondhand consumer smart TV to it falls under “100% predictable” as a possibility - but a used DisplayPort monitor being an attack vector seems much more exotic.
Trademark footnote (not sure this is required for nominative fair use, and any omission is not intended to deny something is a trademark - but it is a good illustration of the proliferation of trademarks describing all this technology): Windows is a trademark of the Microsoft group of companies. Linux is a trademark of Linus Torvalds. Mac® is a trademark of Apple Inc. VESA® and DisplayPort™ are trademarks owned by the Video Electronics Standards Association (VESA®) in the United States and other countries. YouTube™ is a trademark of Google LLC. The terms HDMI and HDMI High-Definition Multimedia Interface are trademarks or registered trademarks of HDMI Licensing Administrator, Inc. Celeron, Intel, Intel Core, Pentium, and Thunderbolt are trademarks of Intel Corporation or its subsidiaries.