Ok, this is a few years late, but ...
APL, as the term is used by a video engineer, stands for average picture level, and refers to the mean of the Y' (luma) component across the image area of a frame (or sequence, or movie, or whatever).
Average luminance - more correctly, average relative luminance - is the mean of the true CIE linear-light luminance across the image area of a frame or sequence, relative to reference white luminance.
The math that relates the two isn't simple, and it can't be done at all absent all three components: R', G', and B', or Y', CB, and CR. Apart from the special case where R', G', and B' are all equal to either zero or one - notably, 100% colour bars - APL and average luminance differ, sometimes dramatically.
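To make that concrete, here's a minimal sketch - assuming BT.709 luma coefficients and a pure 2.4-power display decode (one common assumption; the decoding itself is described in the next paragraph). Two pixels with identical Y' can carry quite different true luminance, so luma alone can't give you average luminance:

```python
import numpy as np

# Sketch only: BT.709 luma weights and an assumed pure 2.4-power decode.
LUMA = np.array([0.2126, 0.7152, 0.0722])   # weights for R', G', B'
GAMMA = 2.4                                  # assumed decoding exponent

def luma(rgb_prime):
    """Nonlinear Y' computed from gamma-encoded R'G'B' (what APL averages)."""
    return float(LUMA @ rgb_prime)

def luminance(rgb_prime):
    """Relative luminance: decode to linear light first, then weight."""
    return float(LUMA @ (rgb_prime ** GAMMA))

gray    = np.array([0.5, 0.5, 0.5])          # neutral 50% gray
magenta = np.array([1.0, 0.3009, 1.0])       # chosen so its Y' is also ~0.5

print(luma(gray), luminance(gray))           # ~0.50, ~0.19
print(luma(magenta), luminance(magenta))     # ~0.50, ~0.32
```

Same Y' for both patches, yet the magenta one delivers roughly two-thirds more light, which is why averaging Y' tells you about APL, not about luminance.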
At a display, video R', G', and B' signals (for our purposes, scaled to 0 = reference black and 1 = reference white) are each raised to a power between 2.0 and 2.5 to yield linear-light RGB components that are directly related to luminance. That decoding maps 0.5 on the video scale to about 0.18 in relative luminance.
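A quick numeric check of that mapping - again just a sketch, assuming a pure power-law decode; real transfer functions differ in the details:

```python
# 50% video level decoded with exponents spanning the range mentioned above.
for gamma in (2.0, 2.2, 2.4, 2.5):
    print(f"gamma {gamma}: 0.5 -> {0.5 ** gamma:.3f}")
# gamma 2.0: 0.250   gamma 2.2: 0.218   gamma 2.4: 0.189   gamma 2.5: 0.177
```

Near the top of that exponent range, 50% video level comes out close to 18% in linear light.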
So: Don't mix up the two!
The APL of broadcast video is very roughly 50%; its average luminance is very roughly 18%.
The average luminance of a movie as presented in the cinema - and not necessarily a dark movie - is roughly 10%. I have this on extremely good authority, from a studio/DI guy who has access to DCI movie data and ran some perl scripts. (Sorry, I know that's gross, but he's an old awk/sed/grep kind of guy.)
The average luminance of a movie as transferred to DVD or Blu-ray is up for discussion, but I'd make a guess somewhere between 10% and 18%, say 14%.