I was asked to prove or disprove the following conjecture: n^2 = Ω(n log n). This one feels like it should be very easy; intuitively, since Ω is a lower bound and n^2 is of a higher order of growth than n log n, it seems that it is also...
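
The intuition here can be made rigorous straight from the definition of Ω; a sketch (the constants c and n_0 below are chosen for illustration):

```latex
n^2 = \Omega(n \log n)
\iff \exists\, c > 0,\ n_0 \ge 1 \ \text{such that}\ n^2 \ge c \cdot n \log n \ \text{for all } n \ge n_0.
```

```latex
\text{Since } \log n \le n \text{ for all } n \ge 1,\quad
n \log n \;\le\; n \cdot n \;=\; n^2,
```

so the definition is satisfied with c = 1 and n_0 = 1.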

I am working through programming problems from InterviewCake[1] and this problem[2] is confusing me. I have an array stock_prices_yesterday where: - The indices are the time, as a number of minutes past trade opening time, which was 9:30am local time. - The values are the price of Apple stock at...

I have been practicing analyzing algorithms lately. I feel I have a pretty good understanding of analyzing non-recursive algorithms, but I am less sure about recursive algorithms, which I have only just begun to study in depth. I have not had a formal check on my methods...

The following is a homework assignment, so I would rather get hints or bits of information that will help me figure this out, not complete answers. Consider an algorithm S that solves a problem and takes as input an array A of size n. After analysis, the following conclusion...

I have the following function, and I need to prove that its time complexity is at most O(x log x): f(x) = x log x + 3 log x^2. I need some help solving this....

I have been asked to prove or disprove the following conjecture: for any given constant c > 0, if f(n) = O(h(n)) then c*f(n) = O(h(n)). I have come up with the following counterexample: let f(n) = n and c = n+1. Then c*f(n) = (n+1)n = n^2+n = O(n^2),...

I know that measuring asymptotic complexity can be based on any resources you have, whether it's time, memory usage, number of comparisons, etc. But when it comes to sorting something, I realize we normally associate the asymptotic notation with the basic operations like number of swaps/steps or number of comparisons....

I need to prove or disprove the following conjecture: if f(n) = O(h(n)) AND g(n) = O(k(n)) then (f − g)(n) = O(h(n) − k(n)). I am aware of the sum and product theorems for combining growth rates, but I could not find a way to apply them here, even though...

I am doing this small task in which I have to arrange asymptotic runtimes in ascending order. Here are the runtimes: Here is the order I believe they should go in: log10(n^4), n^3, 2^(log_4(n)), 2^(100n), e^(pi^4096), n! + 12^1000. Is this correct? Or are there any errors? Thanks!...

I am working through an analysis of algorithms class for the first time, and was wondering if anyone could assist with the example below. I believe I have solved it with O(n) complexity, but was wondering if there is a better version, perhaps O(log n), that I am not thinking of. Let...

The increasing order of the following functions (shown in the picture below) in terms of asymptotic complexity is: (A) f1(n), f4(n), f2(n), f3(n) (B) f1(n), f2(n), f3(n), f4(n) (C) f2(n), f1(n), f4(n), f3(n) (D) f1(n), f2(n), f4(n), f3(n). The time complexity order for this easy question was given as (n^0.99)*(log n) < n ... how?...

If we have two arrays of size n each and want to sort their pairwise sums, the naive approach is to store the sums in O(n^2) space and sort them in O(n^2 log n) time. Suppose we're allowed the same running time of O(n^2 log n); how would we store...

I have an array with, for example, 1000000000000 elements (integers). What is the best approach to pick, for example, only 3 random and unique elements from this array? The elements must be unique across the whole array, not just within the list of N (3 in my example) picked elements. I read about Reservoir...

I would like to understand asymptotic analysis better, since I believe I don't have a solid understanding of it. I would appreciate it if someone could highlight a better approach to it. Here are two examples: for (int i = 1; i <= n; i *= 2) { for (int j =...

Let A[1, …, n] be an array storing a bit (1 or 0) at each location, and let f(m) be a function whose time complexity is Θ(m). Consider the following program fragment written in a C-like language: Case 1: counter = 0; for (i = 1; i <=...

Okay, this might be the worst way to sort an array arr of n distinct integers, but I want to analyse this algorithm: 1. Check if arr is sorted. If so, return. 2. Randomly permute the elements of arr. 3. Repeat steps 1 and 2 until there is a return. Will Goofy’s...

I'm really getting frustrated trying to solve the recurrence above. I was trying to solve it using the Master Method, but I just couldn't get it done... I have a recursive algorithm that takes 3 log n time (three binary searches) to identify four subproblems, each with a size of...

Hi, I am trying to find the big-O of this algorithm. I think it is n^2, but because the size of the inner loop is shrinking each time, I am not sure. for (int i = 0; i < SIZE; i++) { for (int j = i; j < SIZE; j++) { // Code...

Consider the following C function: int fun1(int n) { int i, j, k, p, q = 0; for (i = 1; i < n; ++i) { p = 0; for (j = n; j > 1; j = j/2) ++p; for (k = 1; k < p; k = k*2) ++q; } return q; } The question is to decide which of the...

I am wondering how to find the tight upper bound for T(n). One example: T(n) = T(n/2 + n^(1/2)) + n. I am not sure how to use the domain or range transformation here. I use the domain transformation: let n = 2^(2k) ==> n/2 =...

The time complexity of a loop is considered O(log n) if the loop variable is divided/multiplied by a constant amount. Loop 1: for (int i = 1; i <= n; i *= c) { // some O(1) expressions } Loop 2: for (int i = n; i >...

What are the best case and worst case asymptotic running times for sorting an array of size n using mergesort?...

Based on this radix sort article http://www.geeksforgeeks.org/radix-sort/ I'm struggling to understand what is being explained in terms of the time complexity of certain steps in the sort. From the link: Let there be d digits in the input integers. Radix sort takes O(d*(n+b)) time, where b is the base for representing...

From my understanding, a hashmap insertion is O(1) while an array-list insertion is O(n), since the hashmap's hash function computes the hash code and index and inserts the entry, whereas an array list does a comparison every time it inserts a new element.

Given two functions: f(n) = O(log_2 n) and g(n) = O(log_10 n). Does one of these dominate the other?...

Note: this is problem 4.3 from Cracking the Coding Interview, 5th edition. Problem: Given a sorted (increasing order) array, write an algorithm to create a binary search tree with minimal height. Here is my algorithm, written in Java, to do this problem: public static IntTreeNode createBST(int[] array) { return createBST(array, 0, array.length-1);...

x = 0
for i = 1 to ceiling(log(n))
    for j = 1 to i
        for k = 1 to 10
            x = x + 1

I've included the answer I've come up with here: I think the time complexity is θ(n^2 log(n)), but I am not sure my logic is correct. I would really appreciate any help understanding how to do...