The key is building a model of the queue length. The simplest approach is to pick a threshold x and say "a person arrives at the bank when an accumulating probability value reaches x". On each arrival the value drops back to 0 and then grows over time until it reaches x again.
Another possibility is to draw a random number every simulated second and compare it to the current probability of a person arriving; an arrival occurs whenever the draw falls below that probability.
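A minimal sketch of the second approach, assuming a fixed per-second arrival probability (the function name and the probability value are illustrative, not from the original):

```python
import random

def simulate_arrivals(duration_s, arrival_prob=0.05, seed=42):
    """Flip a biased coin once per simulated second.

    Each second we draw a uniform random number in [0, 1); a draw
    below arrival_prob counts as one person arriving at the bank.
    Returns the list of arrival times in seconds.
    """
    rng = random.Random(seed)
    return [t for t in range(duration_s) if rng.random() < arrival_prob]

arrivals = simulate_arrivals(3600)  # one simulated hour
print(len(arrivals))  # roughly 3600 * 0.05 = 180 arrivals on average
```

A time-varying probability (e.g. higher around lunchtime) could be plugged in by making `arrival_prob` a function of `t`.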
The waiting time depends on the queue length. The worst case is what the last person in a full queue would wait: the maximum queue length times the maximum processing time per customer. The average waiting time can be estimated the same way, using the average queue length and the average processing time.
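Those two estimates can be sketched as follows; the queue lengths and processing times below are made-up example values:

```python
def waiting_time_bounds(max_queue_len, avg_queue_len,
                        max_service_s, avg_service_s):
    """Estimate waiting times from queue length and per-customer service time.

    Worst case: the last arrival finds a full queue of slow customers,
    so they wait max_queue_len * max_service_s seconds.
    Average case: the same product, taken over the averages (a rough
    approximation, assumed here for illustration).
    """
    worst = max_queue_len * max_service_s
    average = avg_queue_len * avg_service_s
    return worst, average

# Example: queue of up to 10 people, 5 min max / 2 min avg per customer
worst, avg = waiting_time_bounds(10, 4, 300, 120)
print(worst, avg)  # 3000 480
```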