Find screen resolution from Mac OS X command line.
Today I found myself wondering whether it's possible to find the current screen resolution from the command line (and hence from a shell script) in Mac OS X. My first thought was to try the same method I use in Linux, which is to extract it from the output of xdpyinfo. That turned out to be no use though. Firstly, running xdpyinfo causes X11.app to launch, which is messy. Secondly, I rather suspect that if X11.app isn't installed then neither is xdpyinfo, and X11 is an optional install pre-Leopard. Finally, it gives me the wrong value for the vertical resolution. My screen resolution is 1680x1050 but xdpyinfo says:
case:~ mike$ xdpyinfo | grep dimensions
dimensions: 1680x1028 pixels (445x272 millimeters)
It's probably unfair to say it's giving the wrong value for the vertical resolution. More likely it's the correct value for what it's measuring, but what it's measuring is not what I want to measure. The difference is 1050 - 1028 = 22 pixels, so I find myself wondering if the Mac OS X Menu Bar (MenuBar? Menubar?) is 22 pixels high, but I don't feel inclined to check.
I couldn't find any other method of getting the screen resolution via Google, so the method I came up with to get the horizontal resolution is:
$ defaults read ~/Library/Preferences/ByHost/com.apple.windowserver.$(/sbin/ifconfig en0 | grep ether | cut -d " " -f 2 | sed 's/://g') | grep " Width =" | cut -d "=" -f 2 | sed 's/[ ;]//g'
To get the vertical resolution, replace 'Width' with 'Height'. It's rather long-winded, but it works. I don't like the grep/cut/sed stuff at the end, but I couldn't find a way to get the defaults command to read the relevant value directly.
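For convenience, the same approach can be wrapped in a short script that reads both values at once. This is only a sketch, making the same assumptions as the one-liner above (that en0 is the relevant interface and that the ByHost plist is named after its MAC address):

#!/bin/sh
# Sketch: read display width and height from the windowserver ByHost plist.
# Assumes en0 is the primary interface, as in the one-liner above.
MAC=$(/sbin/ifconfig en0 | grep ether | cut -d " " -f 2 | sed 's/://g')
PLIST=~/Library/Preferences/ByHost/com.apple.windowserver.$MAC
WIDTH=$(defaults read $PLIST | grep " Width =" | cut -d "=" -f 2 | sed 's/[ ;]//g')
HEIGHT=$(defaults read $PLIST | grep " Height =" | cut -d "=" -f 2 | sed 's/[ ;]//g')
echo "${WIDTH}x${HEIGHT}"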
2 comments
John Rawnsley
osascript -e 'tell application "Finder" to get bounds of window of desktop'
returns an array 0, 0, width, height and seems to be the full window bounds including the menu bar (which is 22px high). I haven’t tried it on a multiple monitor setup.
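For example (untested beyond a single display), piping it through awk gives just the resolution, since osascript prints the list as comma-separated values:
$ osascript -e 'tell application "Finder" to get bounds of window of desktop' | awk -F ", " '{print $3 "x" $4}'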
28 Dec 2009, 21:40
Mike Willis
Interesting. Using AppleScript to get the information wouldn't have occurred to me, although I have dabbled with it in the past. The AppleScript method is neater, but from prefixing the commands with 'time' it appears to be slower than using defaults and piping the output through other commands. Repeatedly running the command I devised gives:
real 0m0.041s
real 0m0.040s
real 0m0.042s
real 0m0.041s
Whilst the AppleScript method takes roughly three times as long:
real 0m0.128s
real 0m0.128s
real 0m0.133s
real 0m0.126s
Though of course such a small absolute difference in execution times isn’t noticeable to the casual observer.
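For reference, the figures above come from prefixing the two commands with time, along the lines of:
$ time osascript -e 'tell application "Finder" to get bounds of window of desktop'
$ time defaults read ~/Library/Preferences/ByHost/com.apple.windowserver.$(/sbin/ifconfig en0 | grep ether | cut -d " " -f 2 | sed 's/://g') | grep " Width =" | cut -d "=" -f 2 | sed 's/[ ;]//g'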
02 Jan 2010, 11:48