My logic for counting the intersection of two arrays works when the arrays contain no repeated values, but for this input the count comes out wrong.
Java
int a[] = {1, 1, 1, 2};
int b[] = {1, 1, 2, 2, 3};
int count = 0;
for (int i = 0; i < a.length; i++) {
    for (int j = 0; j < b.length; j++) {
        if (a[i] == b[j]) {
            count++;
        }
    }
}
The code outputs 8, but the expected output is 3. The nested loops count every matching pair: the three 1s in a each match the two 1s in b (6 pairs), and the 2 in a matches the two 2s in b (2 pairs), for a total of 8.
Answer
Try the following code. I use Math.min() in the loop condition to avoid an ArrayIndexOutOfBoundsException when the arrays have different lengths.
Java
public static void main(String[] args) {
    int a[] = {1, 1, 1, 2};
    int b[] = {1, 1, 2, 2, 3};
    int count = 0;
    // Iterate only up to the shorter array's length
    int min_len = Math.min(a.length, b.length);
    for (int i = 0; i < min_len; i++) {
        if (a[i] == b[i]) {
            count++;
        }
    }
    System.out.println(count);
}
Output is 3
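Note that comparing a[i] == b[i] counts matches at the same index, which happens to give 3 for this input. If the goal is a multiset intersection regardless of position (each value matched at most as many times as it appears in both arrays), one possible sketch uses a frequency map; the class and method names here are hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

public class Intersection {
    // Count the multiset intersection: each value is matched at most
    // min(occurrences in a, occurrences in b) times.
    static int countIntersection(int[] a, int[] b) {
        Map<Integer, Integer> freq = new HashMap<>();
        for (int x : a) {
            freq.merge(x, 1, Integer::sum); // tally occurrences in a
        }
        int count = 0;
        for (int x : b) {
            Integer remaining = freq.get(x);
            if (remaining != null && remaining > 0) {
                count++;                     // consume one occurrence from a
                freq.put(x, remaining - 1);
            }
        }
        return count;
    }

    public static void main(String[] args) {
        int[] a = {1, 1, 1, 2};
        int[] b = {1, 1, 2, 2, 3};
        System.out.println(countIntersection(a, b)); // prints 3
    }
}
```

This also works when the matching values sit at different indices, e.g. {1, 2} and {2, 1}.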