paarthbhasin · 5 years
Week 9 Lecture Notes
Morning Lecture:
Just Culture: Instead of blaming an individual, you look at the whole system.
Upsides of blaming people:
You become happy; one person is taken care of; it's easier.
You can sack them.
Downsides:
You don’t fix the actual problem.
“Shoot the messenger”: sack / kill the person who brings the bad news.
You can’t just keep firing people. It will never fix the problem
NOTE: Adopting this culture worked for Qantas. They put a very strong emphasis on Just Culture, and because of this they are such a reputable airline in Australia and the world. They have been topping air-safety rankings in Australia for a long time due to their highly effective company audits, which focus on improving the system, not just one individual.
A just culture refers to a values-supportive model of shared accountability.
Else: Fault culture, which is full of toxic people.
SECURITY ENGG:
What is it about engineering culture that makes it so successful?
Time: 
things being built over time
ability to observe failures and learn from them
high frequency errors, and medium frequency errors
Methodology & Standards:
Best practices
Not starting from scratch
Ex: The Tacoma Narrows Bridge didn't follow such practices and was built from scratch.
It was wavy and unstable as a result, and it was eventually destroyed.
Using a checklist, which stops you from overlooking things.
Examples:
Three Mile Island: 2 valves were in the wrong state and no one noticed. A new person came in and could see it. If a standard procedure had been in place, this could have been found faster.
Cockpit recordings: For planes that crashed, you can hear what the pilots were doing prior to crashing: following a checklist of things to ensure plane survival.
Creativity / Dealing with uncertainty
Thinking outside the box
Dealing with issues and unpredictability
Testing
Science when done well
following procedure, trying to prove your hypothesis wrong
continuous improvement
more confidence in yourself and your work.
We want to find mistakes
keep evidence
Software Development:
Ship, ship, ship: Be agile
At the cost of shedding security
RICHARD’S LIST (Additional to the ones already mentioned)
Focus on process
Review at every level (even your own work)
Conflict of interest
Professionalism
Serve the profession, not just your boss.
Dealing with conflict of interest
You’re a son, a father, an employee.
You have to make priority decisions sometimes.
Quantifying results:
Danger: Not everything worth measuring can be measured. Not everything that can be measured is worth measuring.
Assurance: Confidence in our beliefs and work quality
Closing the loop
Feed-forward system: no feedback during the cycle
Instead, feedback is given after one forward cycle is complete
You check the idea
Listening and improving
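The closed-loop idea above can be sketched numerically (a toy example of my own, not from the lecture): act, measure the error, and correct before the next cycle.

```python
# Toy closed-loop correction (illustrative only): each forward cycle is
# followed by measuring the error and adjusting, so the estimate converges
# instead of running open-loop forever.
target = 10.0
estimate = 0.0
for _ in range(20):
    output = estimate            # forward pass: act on the current belief
    error = target - output      # listen: measure how wrong we were
    estimate += 0.5 * error      # improve: correct before the next cycle

print(round(estimate, 3))  # converges towards 10.0
```

With feedback, the error halves every cycle; with no feedback at all, the estimate would never move.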
350 ALARMS WENT OFF
Something went wrong
Only 8 alarms were important
Focus on the right things
Security is state of mind
SYSTEM PROPERTIES:
Is solving a problem just noticing little things and fixing them?
No
You need to understand the entire system.
What is coherence?
We need the system to be:
Coherent
Every part of the system is working on achieving the same thing. No waste of work or effort. They are doing meaningful things to achieve something.
Not complex
Not coupled
Wargames Clip
Pull humans out of the loop
Automate things
systems are not perfectly designed
Designers assume preconditions hold, so that postconditions follow.
Attackers’ job: put the preconditions into an imperfect state.
Human creativity is needed to solve such problems.
ASDs are good, they can follow a checklist
But ASDs alone are no good. We need humans who can think outside the box, which ASDs are not capable of.
China Syndrome: Consequences of having humans in the loop
Just knowing your technical stuff is not enough to be a capable engineer. We need to know sociology, psychology etc. in order to understand people. This is a highly important skill to have.
To be a security engineer, you need to think overall, removing mist and focussing on important things.
Society itself is a system
Society being hacked / spoilt
Nazi propaganda
Trump’s election as president
Brexit
Security is end to end:
From start of voting
Chaining of votes (Russia kicks in here)
To collection
To announcing
Work to undermine limits.
Anyone in power works on increasing their power and subverting controls.
Ruler of Galaxy:
Multilevel Security: Each layer controls the layer beneath it
Person at the top controls everything. Single Point of failure.
One brain with many bodies.
Better to have multiple brains and people
You don’t easily get to know that you’re the ruler. You don’t want to be the ruler.
Someone who wants power is excluded from having power.
Assange & Free speech:
You want free speech, but not when it disagrees with your beliefs.
Powerful people don’t like speech against them.
Self driving Cars and Trolley Cars:
Cars kill people
We never have a perfect system. With computers in control, we could reduce this, but we pay a price.
Computers have to make ethical decisions
Kill a boy or an old lady?
This seems difficult for them to decide
How can we ever be truly safe?
How does a car know speed-limit?
Central DB => GPS
If you hack it, then you can play with the limits and cause harm.
Or cars work it out from signage: Computer Vision
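As a sketch of the central-DB approach (all names and values here are invented for illustration), the limit lookup is just a position-keyed table, which is exactly why tampering with it is dangerous:

```python
# Hypothetical sketch of the "central DB => GPS" approach: the car maps its
# GPS position to a road segment, then looks the limit up in a central table.
# Everything here is invented for illustration. If an attacker can write to
# this table, they control the limits every car obeys.
SPEED_LIMIT_DB = {
    "segment_42": 60,   # km/h, residential street
    "segment_99": 110,  # km/h, motorway
}

def speed_limit_for(segment_id: str) -> int:
    # Fail safe: for an unknown segment, assume a conservative limit
    # rather than "no limit".
    return SPEED_LIMIT_DB.get(segment_id, 40)

print(speed_limit_for("segment_99"))  # 110
print(speed_limit_for("unknown"))     # 40 (conservative default)
```

The conservative default is one small way to keep the failure impact low when the lookup fails.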
Third Parties’ Security
Trust: End to end security. 
No trusted third parties
When building a system, at some point you say: use a trusted third party.
Then there is no need to worry about security.
But who can you trust?
Moles / insiders
Good inside / bad outside
What can you do?
need walls everywhere
trusted third parties break trust?
Trusted third party vs roll your own security (not always possible)
Make choice
We set things up so they fail, but we ensure when they fail the impact is low.
Examples:
Missiles you buy may have a backdoor built into them
This stops the builder from being attacked with its own product
YOUR FUTURE:
A new world forming
World needs Cyber Security Engineers.
EFFECTIVE COMMUNICATION:
Be brief
Explain simply
All about the listener not the speaker.
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - 
Evening Lecture (Reverse Engineering Seminar):
Most of the things in the seminar were pretty much self-explanatory except anti-RE, so I am explaining only anti-RE techniques:
Anti-RE is used to make your code / executable harder to crack. There are various ways you can do this:
Don’t release the debug build.
A debug build gives the attacker information about the inside of your executable.
Trick the disassembler:
Excessive jump instructions or Dummy instructions
Movfuscator is a tool that takes obfuscation to an extreme: it compiles the whole program into only mov instructions, making the binary highly confusing to read.
Overlapping of instructions 
Used to hide the actual program structure. Instead of seeing the function return, the disassembler is made to continue, and it sees a return at the incorrect place. It can never figure out that the actual return instruction was hidden inside another instruction.
Like: add eax, 0xC3
A previous instruction jumps into the middle of this one, to the 0xC3 byte, which is the ret opcode in x86. The disassembler cannot figure this out.
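The hidden-ret trick can be demonstrated with a toy decoder (a sketch of mine, not a real disassembler; the opcode values 0x05 = add eax, imm32 and 0xC3 = ret are standard x86):

```python
# The same bytes decode differently depending on where you start:
# a linear-sweep disassembler starting at offset 0 sees one instruction,
# while a jump into offset 1 lands on a hidden ret.
code = bytes([0x05, 0xC3, 0x00, 0x00, 0x00])  # add eax, 0xC3

def decode_at(buf: bytes, offset: int) -> str:
    """Toy decoder handling only the two opcodes used in this example."""
    op = buf[offset]
    if op == 0x05:  # ADD EAX, imm32: opcode plus 4-byte little-endian immediate
        imm = int.from_bytes(buf[offset + 1:offset + 5], "little")
        return f"add eax, {hex(imm)}"
    if op == 0xC3:  # RET (near return)
        return "ret"
    return f"db {hex(op)}"  # unknown byte

print(decode_at(code, 0))  # what linear sweep sees: add eax, 0xc3
print(decode_at(code, 1))  # what a jump to offset 1 executes: ret
```

The 0xC3 byte is "just data" inside the immediate as far as linear sweep is concerned, so the real control flow stays invisible.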
THE CHINA SYNDROME MOVIE was the last part. It is assessable content, as situations from the movie could form case studies.