There is a sequence of n+2 elements a_0, a_1, ..., a_{n+1} (n <= 3000, -1000 <= a_i <= 1000). It is known that

    a_i = (a_{i-1} + a_{i+1})/2 - c_i   for each i = 1, 2, ..., n.
You are given a_0, a_{n+1}, c_1, ..., c_n. Write a program which calculates a_1.
The first line of the input contains an integer n. The next two lines contain the numbers a_0 and a_{n+1}, each with two digits after the decimal point, and the next n lines contain the numbers c_i (also with two digits after the decimal point), one number per line.
The output file should contain a_1 in the same format as a_0 and a_{n+1}.
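One way to attack the problem (a sketch, not necessarily the intended reference solution): rewrite the relation as a_{i+1} - a_i = (a_i - a_{i-1}) + 2*c_i, so the consecutive differences form a sequence that grows by 2*c_i at each step. Telescoping from a_0 up to a_{n+1} gives a_{n+1} = a_0 + (n+1)*(a_1 - a_0) + 2*sum_{i=1..n} (n+1-i)*c_i, a linear equation that can be solved for a_1 in O(n) time. The function name `solve_a1` below is an illustrative choice, not taken from the problem statement:

```python
def solve_a1(n, a0, an1, c):
    """Compute a_1 given a_0, a_{n+1}, and c_1..c_n (c is 0-indexed here).

    From a_i = (a_{i-1} + a_{i+1})/2 - c_i we get
        a_{i+1} - a_i = (a_i - a_{i-1}) + 2*c_i,
    and telescoping the differences up to a_{n+1} yields
        a_{n+1} = a_0 + (n+1)*(a_1 - a_0) + 2*S,
    where S = sum over i=1..n of (n+1-i)*c_i.
    Solving for a_1:
    """
    s = sum((n + 1 - i) * ci for i, ci in enumerate(c, start=1))
    return a0 + (an1 - a0 - 2 * s) / (n + 1)


if __name__ == "__main__":
    # Read input in the format described above and print a_1
    # with two digits after the decimal point.
    import sys

    data = sys.stdin.read().split()
    n = int(data[0])
    a0, an1 = float(data[1]), float(data[2])
    c = [float(x) for x in data[3:3 + n]]
    print(f"{solve_a1(n, a0, an1, c):.2f}")
```

As a quick sanity check: for the arithmetic sequence 1, 2, 3, 4 (n = 2) every c_i is 0, and the formula recovers a_1 = 2; for 0, 1, 4, 9 (the squares, n = 2) both c_i equal 1, and it recovers a_1 = 1.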