Convert two binary numbers into two decimal numbers and compute their sum. Your program must convert two binary numbers b1 and b2 of at most 8 digits into their decimal values d1 and d2, respectively, then output the result of (d1+d2).
Input File Format
The input consists of N test cases. The first line of the input contains a single positive integer N, the number of test cases, followed by the N cases themselves. Each case is exactly one line with two binary numbers b1 and b2 (at most 8 digits each, every digit either 0 or 1) separated by one space. Note that 1 ≤ N ≤ 2147483647.
Output Format
For each case, print the result in one line.
Sample Input:
2
01010000 00000001
11110000 10000000
Sample Output:
81
368
#include <stdio.h>

int binary(void);

int main(void)
{
    int i, k;
    scanf("%d", &k);
    for (i = 0; i < k; i++) {
        int a = binary();          /* first binary number of the case  */
        int b = binary();          /* second binary number of the case */
        printf("%d\n", a + b);
    }
    return 0;
}

/* Read one binary number of at most 8 digits and return its decimal value.
   Reading with %s skips leading whitespace, so the space and newline
   separators are handled automatically, and inputs shorter than 8 digits
   work correctly. */
int binary(void)
{
    char s[16];
    int i, value = 0;
    scanf("%15s", s);
    for (i = 0; s[i] != '\0'; i++)
        value = value * 2 + (s[i] - '0');
    return value;
}