You are given an array of unique integers salary where salary[i] is the salary of the ith employee.
Return the average salary of employees excluding the minimum and maximum salary. Answers within 10⁻⁵ of the actual answer will be accepted.
Example 1:
Input: salary = [4000,3000,1000,2000]
Output: 2500.00000
Explanation: The minimum and maximum salaries are 1000 and 4000 respectively. The average salary excluding the minimum and maximum is (2000 + 3000) / 2 = 2500.
Example 2:
Input: salary = [1000,2000,3000]
Output: 2000.00000
Explanation: The minimum and maximum salaries are 1000 and 3000 respectively. The average salary excluding the minimum and maximum is (2000) / 1 = 2000.
Constraints:
3 <= salary.length <= 100
1000 <= salary[i] <= 10⁶
All the integers of salary are unique.
Solution:
In JavaScript:
var average = function(salary) {
    // First pass: find the minimum and maximum salaries.
    let maxSalary = salary[0], minSalary = salary[0], sum = 0, count = 0;
    for (let i = 0; i < salary.length; i++) {
        if (maxSalary < salary[i])
            maxSalary = salary[i];
        if (minSalary > salary[i])
            minSalary = salary[i];
    }
    // Second pass: sum every salary except the min and max.
    for (let i = 0; i < salary.length; i++) {
        if (salary[i] !== maxSalary && salary[i] !== minSalary) {
            sum += salary[i];
            count++;
        }
    }
    // Average of the remaining salaries.
    return sum / count;
};
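
Since every integer in the array is unique, the same result can be computed in a single sweep by summing everything and then subtracting the extremes, avoiding the second pass. Here is a minimal alternative sketch (averageAlt is an illustrative name, not part of the original solution):

var averageAlt = function(salary) {
    // Sum all salaries, then drop the single min and max values.
    const sum = salary.reduce((acc, s) => acc + s, 0);
    const min = Math.min(...salary);
    const max = Math.max(...salary);
    // The constraints guarantee at least 3 elements, so the divisor is >= 1.
    return (sum - min - max) / (salary.length - 2);
};

For example, averageAlt([4000,3000,1000,2000]) returns 2500, matching Example 1.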