Linking fiction to action: Why touchscreen controls depend on pace, position and direction
Anna Marsh talks non-literal actions
Last month I wrote an article about linking the touch actions of a player to the fiction of the game - i.e. tasking the player with performing actions that mimic real world actions.
It's all about getting the player to 'feel' the game's experiences better, and I cited actions such as turning keys in The Room or picking flowers in Lili.
But what if your game contains touch actions that are completely made up, or don't have a nice neat transition between the touch action and the activity it's triggering?
Three for all
If that's the case, you'll probably want to translate the essence and emotion of the experience to the touch action you're asking the player to make across the screen.
To help convert the experience you want the player to have into the actual touch action, you need to consider three things: pace, position and direction.
Pace - and its changes - has a real impact on the feel of the game, whether you opt for frantic and barely in control, in-the-zone rhythmic, chilled out and calming, or whatever other approach you can think of.
Many real world activities have a definite pace to them - from martial arts to knitting - and observing the pace of an activity you're trying to capture the feel of is vital in terms of transferring that emotion to the game action.
Lady Shotgun Games' Buddha Finger is a fast-paced martial arts-inspired touchscreen game
You can influence the pace by controlling when touch areas appear on screen - for example, tapping hot spots in a rhythm action game - or by controlling when the player needs to touch them, such as using the accelerator and brake to corner on a track in a racing game.
Pace is often heard more readily than it is seen too, so sound effects play an important role in communicating pace. Think of the exaggerated 'thwacks' and 'whooshes' in martial arts films, or indeed the click of knitting needles.
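As a rough illustration of the timing idea above, here's a minimal Python sketch of a hot-spot scheduler. The function name and parameters are my own invention, not from any particular game: a steady interval gives a calm, rhythmic feel, while adding jitter pushes the pace towards frantic and unpredictable.

```python
import random

def spawn_times(duration, interval, jitter=0.0, seed=None):
    """Generate hot-spot spawn times (in seconds) for a touch game.

    A steady `interval` produces a calm, rhythmic beat; `jitter`
    (a fraction of the interval) randomises each gap, making the
    pace feel more frantic and 'searchy'.
    """
    rng = random.Random(seed)
    times = []
    t = 0.0
    while t < duration:
        times.append(round(t, 3))
        offset = rng.uniform(-jitter, jitter) * interval
        t += max(0.05, interval + offset)  # clamp so spawns never overlap
    return times

# A chilled, regular beat: one hot spot every second
calm = spawn_times(duration=5.0, interval=1.0)

# Same duration, tighter interval plus heavy jitter: frantic
frantic = spawn_times(duration=5.0, interval=0.4, jitter=0.8, seed=1)
```

Tuning just these two numbers is one way a designer can dial the same mechanic between 'in-the-zone rhythmic' and 'barely in control'.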
All about shape
The next two concepts are very closely related:
Position - Where on the screen are you asking the player to touch?
Direction - What direction does the player's touch move in?
This could be quite literal - a swipe, for example, has to be traveling in a direction. Or it could be the result of pressing a sequence of single touch areas, one after the other, which take the player's finger in a particular direction across the screen.
The more random these are, the more frantic the game will seem and the more it becomes about searching for the touch areas - be those touch areas buttons or enemies.
The more position and direction are arranged in regular patterns, the more the game becomes about anticipation and prediction of where the next touch is expected.
Whack-A-Mole - a classic anticipation game
A good example of this is the way shooters generally spawn enemies in specific locations that are grouped neither too regularly nor too randomly.
There's a definite pace to the spawning of enemies, and the positions they appear in affect the position and direction of the player's aim. By controlling those elements, a level designer can bias the play towards the kind of shooter experience they want - skilled headshot sniping, or lots of frantic run-and-gun shotgun action.
When combined, position and direction add the aspect of shape - are you essentially getting the player to describe smooth, fluid, curvy shapes, or angular, jagged, shapes across the screen with the positions you're asking them to touch?
Shape and its effect on the emotions is something artists have known about for a long time - sharp forms like zigzags and triangles have energy; circles and curves are calming; squares are strong and so on.
Try using some of the 'psychology of shapes' to influence the shapes you ask the player to make across the screen and see if it helps them to really feel energised, calm or strong.
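If you want to measure which kind of shape a player is actually tracing, one simple approach (my own sketch, not a standard technique from the article) is to look at how sharply the touch path turns between samples: near-zero turning means a smooth, calming curve, while large turn angles mean angular, energetic zigzags.

```python
import math

def path_jaggedness(points):
    """Mean absolute turn angle (radians) along a touch path.

    Values near 0 indicate smooth, flowing shapes; values near
    pi/2 or above indicate sharp, angular, 'energetic' shapes.
    """
    angles = []
    for (x0, y0), (x1, y1), (x2, y2) in zip(points, points[1:], points[2:]):
        a = math.atan2(y1 - y0, x1 - x0)   # heading of first segment
        b = math.atan2(y2 - y1, x2 - x1)   # heading of second segment
        # Wrap the heading change into [-pi, pi] before taking abs
        turn = abs(math.atan2(math.sin(b - a), math.cos(b - a)))
        angles.append(turn)
    return sum(angles) / len(angles) if angles else 0.0

straight = [(0, 0), (1, 0), (2, 0), (3, 0)]        # calm: no turning
zigzag = [(0, 0), (1, 1), (2, 0), (3, 1), (4, 0)]  # energetic: 90-degree turns
```

A game could use a metric like this to check whether the swipes it is prompting really do come out curvy or jagged in practice.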
Talking up The Drowning
By coincidence, The Drowning received its worldwide release in August, and its new approach to first person controls on a touchscreen is - I think - a great example of the points above at work.
Scattered Entertainment's title aims to revolutionise first person shooters on a touchscreen, replacing the usual virtual controllers that ape console joypads with a control system designed entirely for the touchscreen.
In translating the feel of a shooter, I think the studio has succeeded with aplomb.
To shoot, the player taps the screen with two fingers. A single-fingered tap either moves the player character around the world (if they touch the floor) or strikes the enemy with the butt of the weapon (if they tap an enemy).
To aim, the player needs to place their two fingers either side of an enemy, with the bullet being fired at the midpoint between the touches (with, no doubt, a degree of aim assistance helping out).
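The midpoint mechanic is simple enough to sketch in a few lines of Python. To be clear, this is my own illustrative version, not The Drowning's actual code, and the aim-assist radius is an assumed tuning value:

```python
def aim_point(touch_a, touch_b):
    """Midpoint between two simultaneous touches - the spot the
    shot is aimed at in a two-finger tap scheme."""
    (ax, ay), (bx, by) = touch_a, touch_b
    return ((ax + bx) / 2, (ay + by) / 2)

def hit(touch_a, touch_b, target, assist_radius=20.0):
    """True if the midpoint lands within an aim-assist radius
    (in pixels) of the target. The radius is a made-up value
    standing in for whatever assistance the game applies."""
    mx, my = aim_point(touch_a, touch_b)
    tx, ty = target
    return ((mx - tx) ** 2 + (my - ty) ** 2) ** 0.5 <= assist_radius
```

Bracketing a target between two fingers like this is slower than a single poke, which is exactly what gives aiming its deliberate, skill-based feel.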
Directly poking an enemy with a finger is fast and simple. It's just right for a panicky emergency gun butt that buys you a few seconds to get clear of a target.
The Drowning's innovative two-finger aiming mode
But placing a two-fingered tap either side of an enemy - especially when you're going for a headshot and trying to make sure the midpoint of the tap is on the enemy's head - takes a little more consideration, just as taking aim with a conventional controller does.
Overall it makes you feel as if shooting is a process that takes skill and precision, which is a great feel for a shooter.