# The important introduction

### Behavioral Questions

#### Interview Preparation Grid

The columns should represent your major projects or jobs, and the rows the most common behavioral questions. This grid is your study sheet for the behavioral interview; simplify each cell down to keywords to make it easier to learn from.

| Common Questions | Project 1 | Project 2 | Project 3 |
| --- | --- | --- | --- |
| Challenges | | | |
| Mistakes/Failures | | | |
| Enjoyed | | | |
| Leadership | | | |
| Conflicts | | | |
| What you'd do differently | | | |

"What are your weaknesses?"

1. Give a real weakness
2. Don't say something like "working too hard"
3. That comes off as arrogant and unable to acknowledge faults

Questions to ask the interviewer:

1. Genuine questions: ones you actually want to know the answer to
   - "What brought you to this company?"
   - "What has been most challenging for you?"
2. Insightful questions: demonstrate your knowledge or understanding of the technology
   - "I noticed you use technology X. How do you handle problem Y?"
3. Passion questions: show your interest in learning and being a strong contributor
   - "I'm not familiar with technology X, but it sounds like a very interesting solution. Could you tell me a bit more about how it works?"

#### Responding to Behavioral Questions

1. Be specific, not arrogant
   - Specificity means stating the facts and letting the interviewer derive the interpretation. Instead of saying you did most of the work, describe what you did and let the interviewer draw the conclusion.
2. Stay light on details
   - You're telling the story to someone who doesn't know your project, so don't bury them in specifics.
3. Describe what *you* specifically did on a project; talking about "what we did..." isn't very informative for the interviewer.
4. Give structured responses
   - Nugget: start your response with a 'nugget' that succinctly describes what it will be about
   - SAR (Situation, Action, Result): spend your time on the action; don't get stuck describing the situation

### Big O Time

This is the language through which we discuss the efficiency of algorithms.

#### Analogy

The book uses an analogy that highlights what Big O time captures.

If you had to deliver a file to someone across the country in the fastest way possible, how would you do it?

Initial thoughts might lead you to believe that emailing, FTP or some other electronic transfer will be the fastest. However, the book highlights this is only partially correct.

The time to electronically transfer a file to someone else isn't the same for a 1MB file vs 1TB file. Once the size of the file passes a certain threshold, it might actually be quicker to fly the hard drive containing the file to your destination.

Of course, this is an oversimplification of the concept as other variables are involved, but it does highlight how we must assess algorithms with asymptotic runtime, or Big O time, in mind.

Electronic Transfer: $O\left(s\right)$, where $s$ is the size of the file. Time to transfer increases linearly (simplified for the sake of example) with the size of the file being transferred.

Airplane Transfer: $O\left(1\right)$. The time taken to transfer a 1TB file vs a 1MB file doesn't vary, because the plane flies at the same rate and reaches the destination at the same time regardless of file size.
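The crossover in the analogy can be sketched in Python. The bandwidth and flight time below are made-up numbers chosen only to illustrate the two growth rates, not real measurements:

```python
def electronic_transfer_seconds(size_gb, bandwidth_gb_per_s=0.01):
    """O(s): transfer time grows linearly with the file size."""
    return size_gb / bandwidth_gb_per_s

def airplane_transfer_seconds(size_gb):
    """O(1): a cross-country flight takes the same time for any file size."""
    return 6 * 3600  # assume a ~6 hour flight, in seconds

# For a small file the linear-time option wins; past some threshold,
# the constant-time flight becomes faster even though it never speeds up.
small_wins = electronic_transfer_seconds(1) < airplane_transfer_seconds(1)
big_loses = electronic_transfer_seconds(1_000_000) > airplane_transfer_seconds(1_000_000)
```

Under these assumed numbers, the crossover sits at 216 GB: below it the electronic transfer finishes first, above it the plane does.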

You can have multiple variables in a runtime as well. The time needed to eat your food might depend on how hot the food is, $t$, and the size of the bowl being filled, $b$, represented in Big O time as $O\left(tb\right)$.
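In code, a two-variable runtime typically comes from nested loops, one per variable. A minimal sketch (the function and its counter are illustrative, not from the book):

```python
def do_work(t, b):
    """Performs the inner step t * b times, so the runtime is O(t * b):
    doubling EITHER variable doubles the total work."""
    steps = 0
    for _ in range(t):       # one pass per unit of t
        for _ in range(b):   # b units of work inside each pass
            steps += 1
    return steps
```

For example, `do_work(3, 4)` performs 12 steps, and `do_work(6, 4)` performs 24: the work scales with both variables together.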

#### Time Complexity and $Big\ O,\ Big\ \Omega,\ Big\ \Theta$

This section is inspired by a video from the YouTube channel Back to Back SWE. It is recommended to watch that video for extra clarity.

All of these asymptotic concepts are in relation to $T\left(f\left(n\right)\right)$, which is the actual time an algorithm takes. They are described as asymptotic because we want to understand their behavior as $n$ increases infinitely. Usually, these functions are graphed as a function of $n$ inputs (x-axis) against $t$ time (y-axis).

$Big\ O$

1. Defines the upper bound of a function, meaning the actual run time of an algorithm, $T\left(f\left(n\right)\right)$, will approach but always be faster than $O\left(f\left(n\right)\right)$
2. This is why $Big\ O$ is sometimes described as the 'worst case': the actual run time will never be slower than it
3. If an algorithm is $O\left(n\right)$, it is also $O\left(n^2\right),\ O\left(n^3\right)$ and any runtime bigger than $O\left(n\right)$
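As a concrete example, a plain linear search has a worst case proportional to the input length, so it is $O\left(n\right)$; and because Big O is only an upper bound, the very same function is also (unhelpfully) $O\left(n^2\right)$:

```python
def linear_search(items, target):
    """Worst case visits every element once: O(n).
    Since O(n) is merely an upper bound, this function is also
    technically O(n**2), O(n**3), ... -- true but not informative."""
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1
```

For example, `linear_search([4, 8, 15, 16], 15)` returns `2`, and searching for a missing value returns `-1` after scanning the whole list.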

In industry, $Big\ O$ is used synonymously with $Big\ \Theta$.

$Big\ \Omega$

1. It's the same idea as $Big\ O$ but for the lower bound, meaning the run time of an algorithm, $T\left(f\left(n\right)\right)$, will approach but always be slower than $\Omega\left(f\left(n\right)\right)$
2. In contrast, this can be seen as the 'best case' because the actual run time will never be faster than it
3. If an algorithm is $\Omega\left(n\right)$, it is also $\Omega\left(\log\ n\right),\ \Omega\left(1\right)$ and anything else that is faster

$Big\ \Theta$

1. This combines the previous two concepts and requires an $f\left(n\right)$ that is a valid upper bound as well as a lower bound for $T\left(f\left(n\right)\right)$
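A function that always touches every element exactly once, with no early exit, is bounded both above and below by a multiple of $n$, i.e. it is $\Theta\left(n\right)$. A small illustrative sketch:

```python
def find_max(nums):
    """Examines every element exactly once and can never stop early,
    so the runtime is both O(n) and Omega(n) -- that is, Theta(n)."""
    best = nums[0]
    for x in nums[1:]:
        if x > best:
            best = x
    return best
```

Contrast this with `linear_search` above, which is $O\left(n\right)$ but only $\Omega\left(1\right)$, because it may exit on the first element.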

#### Space Complexity

We don't just want to consider the time an algorithm takes; we also want to consider its memory allocation, or space usage.
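As an illustration (my own example, not from the book), two ways to sum a list have the same $O\left(n\right)$ runtime but different space complexity:

```python
def sum_iterative(nums):
    """O(1) extra space: a single accumulator, no matter how long the list is."""
    s = 0
    for x in nums:
        s += x
    return s

def sum_recursive(nums, i=0):
    """O(n) extra space: the recursion goes n levels deep, and every
    level holds a stack frame until the base case is reached."""
    if i == len(nums):
        return 0
    return nums[i] + sum_recursive(nums, i + 1)
```

Both return the same answer, but for a large enough list the recursive version exhausts the call stack while the iterative one runs in constant extra space.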