

Asymptotic notation and Growth of Combinations of Functions: Difference

algorithm,asymptotic-complexity,proof,growth-rate
I need to prove or disprove the following conjecture: if f(n) = O(h(n)) AND g(n) = O(k(n)) then (f − g)(n) = O(h(n) − k(n)) I am aware of the sum and product theorems for growth combination, but I could not find a way to apply them here, even though...
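One standard way to disprove a claim of this shape is a concrete counterexample; the functions below are illustrative choices, not taken from the question:

    Take f(n) = 2n, g(n) = n, and h(n) = k(n) = n.
    Then f(n) = O(h(n)) and g(n) = O(k(n)), but
    (f - g)(n) = 2n - n = n while h(n) - k(n) = 0,
    and n is not O(0), so the conjecture is false.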

Proposed analysis of an algorithm

algorithm,recursion,time-complexity,asymptotic-complexity
I have been practicing analyzing algorithms lately. I feel like I have a pretty good understanding of analyzing non-recursive algorithms, but I am unsure, and have just begun to embark on a full understanding of recursive algorithms as well, although I have not had a formal check on my methods...

What is the time complexity of the given algorithm?

algorithm,time-complexity,complexity-theory,asymptotic-complexity,big-theta
x=0 for i=1 to ceiling(log(n)) for j=1 to i for k=1 to 10 x=x+1 I've included the answer I've come up with here: I think the time complexity is θ(n^2 log(n)), but I am not sure my logic is correct. I would really appreciate any help understanding how to do...
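For reference, a worked count of the basic operation (assuming x = x + 1 is the unit of work), which suggests the bound is polylogarithmic rather than θ(n^2 log(n)):

    total = sum over i = 1..ceiling(log n) of (10 * i)
          = 10 * ceiling(log n) * (ceiling(log n) + 1) / 2
          = θ((log n)^2)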

Do log bases matter in Big O domination?

big-o,asymptotic-complexity
Given two functions: f(n) = O(log2 n) and g(n) = O(log10 n). Does one of these dominate the other?...
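A short change-of-base calculation (a standard identity, not from the question) shows why the base cannot matter here:

    log2(n) = log10(n) / log10(2) ≈ 3.32 * log10(n)

so the two functions differ only by a constant factor, neither dominates the other, and O(log2 n) = O(log10 n).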

Storing pairwise sums in linear space

arrays,algorithm,sorting,big-o,asymptotic-complexity
If we have two arrays of size n each and want to sort their sums, the naive approach would be to store their sums in O(n^2) space and sort it in O(n^2 logn) time. Suppose we're allowed to have the same running time of O(n^2 logn), how would we store...
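One common approach that keeps only O(n) extra space is to sort both arrays and stream the n^2 sums out of a size-n min-heap. The sketch below is an illustrative Java version (the class and method names are made up, not from the question), assuming the goal is to produce the pairwise sums in sorted order within the allowed O(n^2 logn) time:

    import java.util.Arrays;
    import java.util.PriorityQueue;

    public class PairwiseSums {
        // Emits a[i] + b[j] for all n^2 pairs in non-decreasing order while
        // holding at most n entries in the heap: O(n) extra space, O(n^2 log n) time.
        static void printSortedSums(int[] a, int[] b) {
            if (a.length == 0 || b.length == 0) return;
            Arrays.sort(a);
            Arrays.sort(b);
            // Heap entry {sum, i, j} stands for a[i] + b[j].
            PriorityQueue<long[]> heap =
                new PriorityQueue<>((x, y) -> Long.compare(x[0], y[0]));
            for (int i = 0; i < a.length; i++) {
                heap.add(new long[]{(long) a[i] + b[0], i, 0});
            }
            while (!heap.isEmpty()) {
                long[] top = heap.poll();              // smallest remaining sum
                System.out.println(top[0]);
                int i = (int) top[1], j = (int) top[2];
                if (j + 1 < b.length) {                // advance within row i
                    heap.add(new long[]{(long) a[i] + b[j + 1], i, j + 1});
                }
            }
        }

        public static void main(String[] args) {
            printSortedSums(new int[]{3, 1, 2}, new int[]{5, 4});
        }
    }

Each of the n^2 sums is popped once at O(log n) heap cost, which matches the O(n^2 logn) budget while never storing more than n heap entries at a time.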

If f(n) = O(h(n)) then c*f(n) = O(h(n)) for all c > 0 - proof challenged?

algorithm,asymptotic-complexity,proof,growth-rate
I have been asked to prove or disprove the following conjecture: For any given constant c>0 | If f(n) = O(h(n)) then c*f(n) = O(h(n)) I have come up with the following counter example: Let f(n) = n and c = n+1. Then c*f(n) = (n+1)n = n^2+n = O(n^2),...
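For what it's worth, the usual reading of the statement is that c is a fixed constant that does not depend on n, so choosing c = n + 1 is not a legal counterexample. A short direct argument (standard, not from the question):

    If f(n) = O(h(n)), there are constants a > 0 and n0 with f(n) <= a * h(n) for all n >= n0.
    Then for any fixed c > 0, c * f(n) <= (c * a) * h(n) for all n >= n0,
    and c * a is again a constant, so c * f(n) = O(h(n)).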

Asymptotic Analysis for nested loop

algorithm,asymptotic-complexity
I would like to understand asymptotic analysis better, since I believe I don't have a solid understanding of it. I would appreciate it if someone could highlight a better approach to it. Here are two examples: for (int i = 1; i <= n; i *= 2) { for (int j =...

Time complexity of various nested for loops

loops,for-loop,big-o,time-complexity,asymptotic-complexity
The time complexity of a loop is considered to be O(log n) if the loop variable is divided or multiplied by a constant amount. loop 1 ---- for (int i = 1; i <=n; i *= c) { // some O(1) expressions } loop 2 ----- for (int i = n; i >...
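A quick iteration count for loop 1 (assuming c > 1 and an O(1) body) shows where the O(log n) comes from:

    i takes the values 1, c, c^2, ..., c^k, and the loop stops once c^k > n,
    so the number of iterations is about log_c(n) + 1 = O(log n);
    the constant c only changes the constant factor.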

Big O notation for brute force solution

big-o,asymptotic-complexity
I am working through programming problems from InterviewCake[1] and this problem[2] is confusing me. I have an array stock_prices_yesterday where: - The indices are the time, as a number of minutes past trade opening time, which was 9:30am local time. - The values are the price of Apple stock at...
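For context, a brute-force baseline (assuming the goal is the maximum profit from buying once and then selling later, which is how this problem is usually posed) checks every buy/sell pair. An illustrative Java sketch with made-up names:

    public class BruteForceProfit {
        // O(n^2): try every (buy, later sell) pair and keep the best profit.
        static int maxProfit(int[] stockPricesYesterday) {
            // assumes at least two prices, otherwise there is no valid buy/sell pair
            int best = stockPricesYesterday[1] - stockPricesYesterday[0];
            for (int buy = 0; buy < stockPricesYesterday.length; buy++) {
                for (int sell = buy + 1; sell < stockPricesYesterday.length; sell++) {
                    best = Math.max(best, stockPricesYesterday[sell] - stockPricesYesterday[buy]);
                }
            }
            return best;
        }

        public static void main(String[] args) {
            int[] prices = {10, 7, 5, 8, 11, 9};
            System.out.println(maxProfit(prices)); // prints 6 (buy at 5, sell at 11)
        }
    }

This O(n^2) pair check is the baseline that a single-pass O(n) solution is usually compared against.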

HashMap vs. ArrayList insertion performance confusion

java,arraylist,hashmap,time-complexity,asymptotic-complexity
From my understanding, a HashMap insertion is O(1) and an ArrayList insertion is O(n): for the HashMap, the hash function computes the hash code and index and inserts the entry, whereas an ArrayList does a comparison every time it inserts a new element.
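For reference, a small illustrative snippet (based on java.util's documented behavior, not on code from the question) showing which operations the usual costs refer to:

    import java.util.ArrayList;
    import java.util.HashMap;

    public class InsertionCosts {
        public static void main(String[] args) {
            ArrayList<Integer> list = new ArrayList<>();
            list.add(42);      // append at the end: amortized O(1)
            list.add(0, 7);    // insert at index 0: O(n), existing elements are shifted right
            HashMap<String, Integer> map = new HashMap<>();
            map.put("key", 1); // expected O(1), assuming a reasonable hash function and load factor
        }
    }

The O(n) cost of ArrayList insertion comes from shifting elements (or occasionally resizing the backing array), not from comparisons.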

Why does this loop return a value that's O(n log log n) and not O(n log n)?

loops,for-loop,time-complexity,nested-loops,asymptotic-complexity
Consider the following C function: int fun1 (int n) { int i, j, k, p, q = 0; for (i = 1; i<n; ++i) { p = 0; for (j=n; j>1; j=j/2) ++p; for (k=1; k<p; k=k*2) ++q; } return q; } The question is to decide which of the...
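A worked count of what the function returns (standard analysis, spelled out for reference):

    For each of the n - 1 values of i:
      the j loop halves j from n down to 2, so p ends up at about log2(n);
      the k loop doubles k from 1 up to p, so q grows by about log2(p) = log2(log2 n).
    Hence the returned value q is about (n - 1) * log2(log2 n) = θ(n log log n),
    even though the running time of fun1 itself is θ(n log n) because of the j loop.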

Asymptotic analysis of functions

algorithm,time-complexity,complexity-theory,asymptotic-complexity
I have the following function, and I need to prove that its time complexity is less than or equal to O(x log x): f(x) = x log x + 3 log x^2. I need some help to solve this....
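Reading 3logx2 as 3 * log(x^2) (a superscript appears to have been lost in the excerpt), a short bound:

    f(x) = x log x + 3 log(x^2) = x log x + 6 log x <= x log x + 6 x log x = 7 x log x for x >= 2,
    so f(x) = O(x log x) with, for example, c = 7 and x0 = 2.

The same conclusion holds if the term is read as 3 (log x)^2, since (log x)^2 = O(x log x) as well.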

Big-O Computational Resources

algorithm,sorting,big-o,computer-science,asymptotic-complexity
I know that measuring asymptotic complexity can be based on any resources you have, whether it's time, memory usage, number of comparisons, etc. But when it comes to sorting something, I realize we normally associate the asymptotic notation with the basic operations like number of swaps/steps or number of comparisons....

Asymptotic complexity for typical expressions

time-complexity,asymptotic-complexity
The increasing order of the following functions, shown in the picture below, in terms of asymptotic complexity is: (A) f1(n); f4(n); f2(n); f3(n) (B) f1(n); f2(n); f3(n); f4(n) (C) f2(n); f1(n); f4(n); f3(n) (D) f1(n); f2(n); f4(n); f3(n). (a) The time complexity order for this easy question was given as (n^0.99)*(logn) < n ... how?...
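On the one part that is quoted, a short justification (a standard fact, independent of the missing picture):

    For every e > 0, log n grows more slowly than n^e, so
    (n^0.99) * (log n) = o(n^0.99 * n^0.01) = o(n),
    which is why (n^0.99)*(logn) is ranked strictly below n.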

Problems solving the recurrence T(n) = 4T(n/4) + 3 log n

performance,algorithm,recursion,asymptotic-complexity
I'm really getting frustrated about solving the recurrence above. I was trying to solve it by using the Master Method, but I just didn't get it done... I have a recursive algorithm that takes 3 log n time (three binary searches) to identify four subproblems, each with a size of...
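For reference, a Master-method reading of the recurrence (assuming 3 log n really is the total non-recursive work per call):

    T(n) = 4T(n/4) + 3 log n: a = 4, b = 4, so n^(log_b a) = n^1 = n.
    Since 3 log n = O(n^(1 - e)) for, say, e = 0.5, Case 1 of the Master theorem applies,
    giving T(n) = θ(n).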

Best algorithm to find N unique random numbers in VERY large array

performance,algorithm,big-o,bigdata,asymptotic-complexity
I have an array with, for example, 1000000000000 elements (integers). What is the best approach to pick, for example, only 3 random and unique elements from this array? The elements must be unique in the whole array, not just within the list of N (3 in my example) picked elements. I read about Reservoir...

Complexity of a random sorting

algorithm,asymptotic-complexity
Okay, this might be the worst way to sort an array arr of n distinct integers, but I want to analyse this algorithm: Check if arr is sorted. If so, return. Randomly permute the elements of arr. Repeat Steps 1 and 2 until there is a return. Will Goofy’s...
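A sketch of the usual expected-cost analysis of this 'bogosort'-style procedure (assuming each shuffle produces a uniformly random permutation and the n elements are distinct):

    Each round costs O(n): an O(n) sortedness check plus an O(n) shuffle.
    A uniformly random permutation is sorted with probability 1/n!, so the expected
    number of rounds is n!, and the expected running time is θ(n * n!).
    The worst case is unbounded, since an unlucky sequence of shuffles can continue forever.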

Algorithms Asymptotic running times

algorithm,mergesort,asymptotic-complexity
What are the best case and worst case asymptotic running times for sorting an array of size n using mergesort ?...
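For reference, the standard recurrence (not taken from the question itself):

    Mergesort always splits the array in half and merges in linear time, so
    T(n) = 2T(n/2) + θ(n) = θ(n log n)
    in both the best case and the worst case; the input order only changes constant factors.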

Best and worst case time for Algorithm S when the time complexity changes according to whether n is even or odd

algorithm,big-o,asymptotic-complexity,growth-rate
The following is a homework assignment, so I would rather get hints or bits of information that would help me figure this out, and not complete answers. Consider S an algorithm solution to a problem that takes as input an array A of size n. After analysis, the following conclusion...

Have I properly sorted these runtimes in order of growth?

math,big-o,time-complexity,asymptotic-complexity
I am doing this small task in which I have to arrange asymptotic runtimes in ascending order. Here are the runtimes: Here is the order I believe they should go in: log10(n^4), n^3, 2^((log4n)), 2^(100n), e^pi^4096, n! + 12^1000 Is this correct? Or are there any errors? Thanks!...
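A few identities that may help when checking the order (reading 2^((log4n)) as 2 raised to the log-base-4 of n; the original expression is ambiguous):

    e^pi^4096 contains no n at all, so it is a constant and should come first.
    log10(n^4) = 4 * log10(n) = θ(log n).
    2^(log_4 n) = 2^((log2 n) / 2) = n^(1/2).
    n! + 12^1000 = θ(n!), since 12^1000 is a constant.

Under that reading, the ascending order would be e^pi^4096, log10(n^4), 2^((log4n)), n^3, 2^(100n), n! + 12^1000.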

Time complexity of if-else statements in a for loop

if-statement,for-loop,time-complexity,asymptotic-complexity
Let A[1, …, n] be an array storing a bit (1 or 0) at each location, and let f(m) be a function whose time complexity is θ(m). Consider the following program fragment written in a C-like language: Case 1: counter = 0; for (i = 1; i <=...

Would this algorithm run in O(n)?

algorithm,big-o,time-complexity,complexity-theory,asymptotic-complexity
Note: This is problem 4.3 from Cracking the Coding Interview, 5th Edition. Problem: Given a sorted (increasing order) array, write an algorithm to create a binary search tree with minimal height. Here is my algorithm, written in Java, to do this problem: public static IntTreeNode createBST(int[] array) { return createBST(array, 0, array.length-1);...
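Assuming the truncated code follows the usual approach (take the middle element as the root, then recurse on the two halves), a quick recurrence for the running time:

    T(n) = 2T(n/2) + O(1), which solves to T(n) = θ(n):
    every array element is visited exactly once to create its node and only O(1)
    extra work is done per element, so the construction is linear in the array size.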

Radix sort explanation

algorithm,sorting,asymptotic-complexity,radix
Based on this radix sort article http://www.geeksforgeeks.org/radix-sort/ I'm struggling to understand what is being explained in terms of the time complexity of certain methods in the sort. From the link: Let there be d digits in input integers. Radix Sort takes O(d*(n+b)) time where b is the base for representing...
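A short expansion of where the O(d*(n+b)) comes from (standard counting-sort reasoning):

    Each pass is a counting sort on one digit: it touches all n keys and all b buckets, so it costs O(n + b).
    If the maximum key value is K, the number of base-b digits is d = floor(log_b K) + 1.
    Running d passes therefore costs O(d * (n + b)) = O((n + b) * log_b K).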

Asymptotic notation: How to prove that n^2 = Ω(nlogn)?

algorithm,asymptotic-complexity,proof,growth-rate
I was asked to prove or disprove the following conjecture: n^2 = Ω(nlogn) This one feels like it should be very easy, and intuitively it seems to me that because Ω is a lower bound function, and n^2 is by definition of higher magnitude than nlogn, then it is also...
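A direct proof sketch from the definition of Ω, with explicit constants (which is what such exercises usually ask for):

    n^2 = Ω(n log n) means there exist c > 0 and n0 with n^2 >= c * n * log n for all n >= n0.
    Since log n <= n for all n >= 1, we have n^2 = n * n >= n * log n,
    so c = 1 and n0 = 1 work, and therefore n^2 = Ω(n log n).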

How to find the asymptotically tight upper bound for T(n) in these recurrences?

algorithm,big-o,asymptotic-complexity
I wonder how exactly to find the tight upper bound for T(n). For one example below: T(n) = T(n/2 + n^(1/2)) + n. I am not that sure how to use the domain or range transform here. I use the domain transform here: let n = 2^(2k) ==> n/2 =...
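One way to see a tight bound without any change of variable (a sketch, assuming n^(1/2) is the intended reading):

    For n >= 16, n^(1/2) <= n/4, so n/2 + n^(1/2) <= 3n/4 and
    T(n) <= T(3n/4) + n <= n * (1 + 3/4 + (3/4)^2 + ...) + O(1) = 4n + O(1),
    while T(n) >= n trivially, giving T(n) = θ(n).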

Analysis of Algorithms - Find missing Integer in Sorted Array better than O(n)

arrays,algorithm,sorting,asymptotic-complexity
I am working through analysis of algorithms class for the first time, and was wondering if anyone could assist with the below example. I believe I have solved it for an O(n) complexity, but was wondering if there is a better version that I am not thinking of O(logn)? Let...
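For the common variant of this exercise where the array is sorted and holds the distinct integers 1..n with exactly one value missing, an O(log n) approach is to binary search for the first position where the value no longer matches its index. An illustrative Java sketch (the names and the exact problem setup are assumptions, since the excerpt is cut off):

    public class MissingNumber {
        // Assumes: a is sorted and contains the integers 1..n with exactly one missing,
        // so a.length == n - 1. Returns the missing integer in O(log n) time.
        static int findMissing(int[] a) {
            int lo = 0, hi = a.length;        // binary search over positions
            while (lo < hi) {
                int mid = lo + (hi - lo) / 2;
                if (a[mid] == mid + 1) {
                    lo = mid + 1;             // prefix up to mid is intact; the gap is to the right
                } else {
                    hi = mid;                 // a[mid] > mid + 1; the gap is at or before mid
                }
            }
            return lo + 1;                    // first position where the pattern breaks
        }

        public static void main(String[] args) {
            System.out.println(findMissing(new int[]{1, 2, 4, 5})); // prints 3
        }
    }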

Big O with removing an element each time

big-o,asymptotic-complexity
Hi, I am trying to find out the big-O of this algorithm. I think it is n^2, but because the size of the sub-loop is shrinking each time, I am not sure. for(int i= 0; i < SIZE; i++){ for(int j = i; j < SIZE; j++) { //Code...
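A worked count of the inner iterations (assuming the body marked //Code is O(1)), which confirms the quadratic guess:

    For each i the inner loop runs SIZE - i times, so the total is
    SIZE + (SIZE - 1) + ... + 1 = SIZE * (SIZE + 1) / 2 = θ(SIZE^2);
    the shrinking inner loop only halves the constant, and the algorithm is still O(n^2) with n = SIZE.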