Written November 2018
Beginning with A-Shell 6.5.1651 of November 2018, the fontsize-char parameter is available to allow the precise setting of font sizes.
When fontsize-char is true, the point size in a SETFONT statement is taken to be relative to the size of the character symbols in that font, rather than the size of the character cells. The difference between the two is that the character cell includes one or more rows of empty pixels for the internal leading, resulting in slightly smaller characters and larger overall vertical spacing for a given point size and external leading.
Most word processing programs calculate fonts based on the character size (fontsize-char = true) rather than the cell size (fontsize-char = false), so if you are trying to match the fonts you would otherwise get from the output of a word processor (or XTEXT), then you should set this option to true.
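The cell-versus-character relationship described above can be sketched numerically. The sketch below uses the Windows TEXTMETRIC naming convention (tmHeight, tmInternalLeading) for the metrics; the specific pixel values are invented for illustration and are not taken from any particular font.

```python
# Illustration of cell-based vs character-based font sizing.
# A character cell = visible glyph area + internal leading, so for a
# given point size, cell-based sizing yields slightly smaller glyphs.

def char_height(cell_height: float, internal_leading: float) -> float:
    """Visible glyph height: the cell minus its internal leading."""
    return cell_height - internal_leading

def cell_points_for_char_points(points: float, cell_height: float,
                                internal_leading: float) -> float:
    """Point size to request under cell-based sizing so the glyphs come
    out the same size as a character-based request of `points`."""
    return points * cell_height / (cell_height - internal_leading)

# Hypothetical metrics: 16-pixel cell with 2 pixels of internal leading.
print(char_height(16, 2))        # glyphs occupy 14 of the 16 pixels
print(round(cell_points_for_char_points(12, 16, 2), 2))  # ~13.71 points
```

In other words, if a font reserves 2/16 of its cell for internal leading, a cell-based request of roughly 13.71 points produces the same glyph size as a character-based request of 12 points.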
Note that in prior versions of A-Shell, there was—and still is—an alternative way to get the same effect: set the point size to a negative value. For example:
//; 12 point Arial based on the cell size (default)
//SETFONT,120,Arial
//; 12 point Arial based on the character size (negative point size)
//SETFONT,-120,Arial
//; Set option to base font size on character size...
//SETOPTION,FONTSIZE-CHAR
//; 12 point Arial based on the character size (FONTSIZE-CHAR true)
//SETFONT,120,Arial
Comments
As with all other SETOPTION values, the initial/default value established at the start of each print request is false. Within the context of any given SETOPTION directive, however, the default value is true, i.e. you don't need to specify the ,true parameter explicitly.
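To make the two-defaults point concrete, here is a hedged sketch of both directive forms. The explicit ,FALSE spelling is assumed from the general SETOPTION parameter convention described above, not shown elsewhere in this article:

//; enable character-based sizing (the ,TRUE parameter is implied)
//SETOPTION,FONTSIZE-CHAR
//; explicitly revert to cell-based sizing for subsequent SETFONTs
//SETOPTION,FONTSIZE-CHAR,FALSE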
See Also
• SETVMI
History
2018 November, A-Shell 6.5.1651: Parameter added to SETOPTION