Testing Plan
A testing plan (or, e.g., a performance test plan) is typically a written document, though other formats are possible. It can also be written for non-project activities, for example:
 § Maintenance test plan. 
 ○ Coverage testing: 
 § Statement coverage: 
 □ Your test suite makes the source code execute, for example, 80% of the statements.
 § Branch coverage: 
 □ A stricter notion than statement coverage: whenever there is a branch, each alternative has to be covered. A branch is covered if you have tests that make execution go to one side and tests for the other side.
 § General idea: how well does your test suite exercise your code base?
 § Ideally, your unit tests should achieve around 90% statement coverage (see the first sketch at the end of this note).
 ○ Example: Agile Corporation is a large publishing organization producing magazines and books.
 ○ This plan is available on the project portal, and the newest version is also posted in the top right corner of the storyboard in the development room.
 ○ Test plan for New subscription system (NSS) Vers.: Iteration 3 
 § Covers: 
 □ NSS iteration 3 results and stories, including results of the previous iterations
 § People:
 □ Each iteration is carried out by a team consisting of developers, user representatives and testers. The developers ultimately report to the Head of Development (Ursula), and the testers to the Head of QA (Stephan).
 § Risks: 
 □ You don’t have to list them here. 
 □ The specific risks for this iteration are listed on the story cards. The general risk is that the iteration team does not have access to live data in the supporting databases.
 § Test strategy: 
 ® Create automated tests based on stories before coding starts
 ® Retest every time something has changed
 ® Estimate and cost the testing and development work
 ® Use the test design techniques most appropriate to the acceptance criteria 
® Ensure and verify that testing achieves statement coverage of at least 90% of all code (see the second sketch below).
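
A minimal sketch of the coverage notions above, in Scala (names are made up for illustration). The single test below executes every statement of grade, so statement coverage is 100%, but it never takes the false side of the branch, so branch coverage is only 50%:

object Grading {
  def grade(score: Int): String = {
    var label = "fail"      // statement 1
    if (score >= 50)        // branch: true / false
      label = "pass"        // statement 2
    label                   // statement 3
  }
}

object GradingTest {
  def main(args: Array[String]): Unit = {
    // Executes statements 1-3 (100% statement coverage) but only the
    // true side of the branch (50% branch coverage).
    assert(Grading.grade(70) == "pass")
    // Uncommenting this second test would also cover the false side:
    // assert(Grading.grade(30) == "fail")
  }
}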
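
One way to enforce the 90% statement-coverage target from the test strategy is a coverage plugin. A hedged sketch using sbt-scoverage (the key names below come from recent plugin versions and are an assumption; older versions used coverageMinimum instead):

// build.sbt
coverageMinimumStmtTotal := 90   // fail the build if statement coverage < 90%
coverageFailOnMinimum := true

Running sbt coverage test coverageReport then produces the report and enforces the threshold.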
Test Automation
Test automation reduces the margin of human error, frees up time, etc.
 • What do we have to do for automated tests that manual testing doesn't require?
 ○ We have to write the code etc. that automates the tests, which creates overhead. Automated testing is only worth it if we reuse the tests often.
• What are some myths of test automation?
 ○ It discovers more defects than manual testing: in reality, it can only detect the defects that we already anticipated and wrote checks for. The ones we haven't foreseen we cannot automate, whereas an experienced tester will have a good idea of where developers might make mistakes.
 ○ Automation fixes the development process:
 § If you have a deficient development process and you automate it, you'll have an automated deficient development process lol. 
 § What does help is integrating acceptance tests into earlier stages; that makes sure you discover things before the release date.
 ○ Automated testing is faster than manual testing: only the repeated execution is fast; writing and maintaining the automated tests takes time.
 ○ Every test needs to be automated 
 § Not every test can be automated. 
 • What are hidden costs in test automation? 
 ○ Maintenance. The tests need access to the source code; if anything changes, then you also have to change the automation process.
 § Do the interfaces evolve? 
 □ Yes, and as they evolve, the tests should as well.
 • What are benefits of automated testing?
 ○ Load/performance testing
 § It's unreasonable to set up 300 client machines and manually test the load/performance on them.
 § You need to take care to make load tests realistic. Let's say you identify that end users on average send one request to the system every 5 seconds. If you simulate exactly that, the VMs may be in sync and all send one request at the beginning of each 5-second window, so the load will be either higher or lower than in actuality, but never equal to it (see the first sketch at the end of this note).
 ○ Smoke testing  
 § You select the most important tests from your test suite, and those are the ones you run in every iteration instead of the whole set of test cases (see the second sketch at the end of this note).
 ○ Setting up test data and pre-test context 
 § Fixtures: automate those as well.
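
A minimal sketch of the load-testing caveat above (numbers and names are made up): adding random jitter to the inter-request delay keeps the simulated clients from firing in lockstep at the start of every 5-second window.

import scala.util.Random

object LoadSim {
  def main(args: Array[String]): Unit = {
    val clients = 10          // stand-in for e.g. 300 simulated client machines
    val meanDelayMs = 5000    // "one request every 5 seconds" on average

    val threads = (1 to clients).map { id =>
      new Thread(() => {
        for (_ <- 1 to 3) {
          // Uniform in [0.5x, 1.5x] of the mean: clients drift apart
          // instead of all hitting the server at the same instant.
          val delay = meanDelayMs / 2 + Random.nextInt(meanDelayMs)
          Thread.sleep(delay)
          println(s"client $id sends a request") // real code would make an HTTP call
        }
      })
    }
    threads.foreach(_.start())
    threads.foreach(_.join())
  }
}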
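
And a sketch of selecting smoke tests by tagging them, assuming ScalaTest as the test framework (suite and test names are made up):

import org.scalatest.Tag
import org.scalatest.funsuite.AnyFunSuite

object Smoke extends Tag("Smoke")   // marks the most important tests

class CheckoutSuite extends AnyFunSuite {
  test("checkout computes a total", Smoke) {
    assert(2 * 10 + 5 == 25)        // stand-in for a real checkout check
  }
  test("rarely exercised discount path") {
    assert(25 - 5 == 20)
  }
}

// Run only the smoke subset, e.g.: sbt "testOnly -- -n Smoke"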
Acceptance Testing
xAT: which kinds are there?
 ○ There is UAT: User Acceptance Testing.
 § Done by the users. 
 § Can entail testing the processes around the software: not only is the software itself tested, but also that all the workflows work with the new software in place.
 ○ Factory Acceptance Testing (FAT) is the testing that the developers do, at the developers' own office/site.
 § You could argue it is not done by the user but by someone pretending to be the user; on the other hand, you might want such a stage in order to run the tests that the user would run and see that they all pass in the developer's context/office.
 ○ Site Acceptance Testing (SAT), which is done on site, in the infrastructure of the client.
 § Done by developers or users.
• Main purpose of acceptance testing:
 ○ That the software does what the client wants it to do. You've done your part; now the client gets to check, basically.
 ○ Acceptance entails the transfer of ownership/responsibility; it also concludes the developer's work and therefore makes the payment due.
 ○ The purpose: is the contractual obligation fulfilled?
• How much effort should we expect this to cause?
 ○ Ideally none. You have worked with the tests all along, and much of the testing is done as the project goes through its iterations, because it is integrated into the test suites. If you have continuous deployment, you can even have acceptance testing integrated into your iterations.
 • What additional aspects/deliverables are required here?
 ○ A demo, a manual, support contracts, training materials, and modified business processes that are impacted by the integration of the software into the business.
Stubs, mocks, scaffolding
• How do we test A, which relies on B, before B is ready?
○ Without changing the structure of the main function, we can, for example, replace a function inside it that has not been implemented yet with one that returns dummy values.
○ Stubs and mocks are pretty much the same thing: essentially a pre-made function that returns hard-coded values/results instead of running the actual function.
 § Stub: you've actually started writing B, you're just not confident that B is correct yet, so you comment out the body and replace it with the hard-coded values.
 § Mock: you haven't even started with B; you create the hard-coded function from scratch.
○ By not modifying A and only substituting B, the main function produces the "correct" result if all the tests pass with the stubbed B; if errors arise, they are due to B and can be pinpointed (see the first sketch at the end of this note).
○ A fixture is the code and setup required to run a test. Examples: a database with dummy values, the client side, the window that will contain the template, etc. It should live under the test folder, not the main one.
§ Either you avoid tests that destroy or change the fixture, so that subsequent tests are not affected by a previous test's changes, or you make sure that after destroying the fixture you reset everything for the next test (see the second sketch below).
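
A minimal sketch of the stub/mock idea above, with made-up names (TaxRules plays the role of B, PriceCalculator the role of A):

// B's interface — the real implementation is not ready yet.
trait TaxRules {
  def rateFor(country: String): Double
}

// A: the code under test; it depends only on B's interface.
class PriceCalculator(rules: TaxRules) {
  def gross(net: Double, country: String): Double =
    net * (1 + rules.rateFor(country))
}

// The stub/mock: hard-coded values instead of the real lookup.
object StubTaxRules extends TaxRules {
  def rateFor(country: String): Double = 0.25 // fixed dummy rate
}

object StubDemo {
  def main(args: Array[String]): Unit = {
    // A runs against the stub; if this assertion fails, the error is in A, not B.
    val calc = new PriceCalculator(StubTaxRules)
    assert(calc.gross(100.0, "SE") == 125.0)
  }
}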
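
And a sketch of an automated fixture that is reset before every test, assuming ScalaTest as the test framework (the in-memory map stands in for a database with dummy values):

import org.scalatest.BeforeAndAfterEach
import org.scalatest.funsuite.AnyFunSuite
import scala.collection.mutable

class AccountSuite extends AnyFunSuite with BeforeAndAfterEach {
  // The fixture: an in-memory stand-in for a database with dummy data.
  val db = mutable.Map.empty[String, Int]

  // Reset the fixture before every test, so a test that mutates it
  // cannot affect the tests that run after it.
  override def beforeEach(): Unit = {
    db.clear()
    db ++= Map("alice" -> 100, "bob" -> 50)
  }

  test("withdraw reduces the balance") {
    db("alice") -= 30
    assert(db("alice") == 70)
  }

  test("each test sees a fresh fixture, even after the previous one mutated it") {
    assert(db("alice") == 100)
  }
}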
Testing Notes
How would we attempt to test the property "no memory leaks"? 
 • Source code instrumentation: for example, create our own memory allocation functions so that we can record every call, and then keep a table of all the calls that were made and where, to see if there are any leaks (see the first sketch at the end of this note).
 • Example: Dmalloc.
Unit, integration and system test:
 • Unit tests:
 ○ We treat something as a unit and test its behavior.
○ It may be a single class, file or even the whole application 
 • Integration tests:
 ○ It is about testing the interfaces.
 ○ If you have one function calling another function, that is an interface.
 ○ The object-oriented hierarchy also provides an interface: if we have inheritance and inherited functions, we have an interface for the relationship between them, and we can test those too.
 ○ How do we test those? 
 § It is hard, but what you can do is work with mutation tests; they can show you whether you have a sufficient number of tests that probe the interface.
 § For example, something to test would be a function that takes two inputs (neededScoresToPassClass, actualScoresOfClass). An interface test would check that both values are passed in the correct order.
 § Integration mutation:
 □ Modify values in that call 
 □ Change method called 
 □ Change value received in parameter/value returned to caller 
 □ Change return statements
 □ Object-oriented features: e.g. change an access modifier or insert a hiding variable.
Scalameta is a source code analyzer/transformer tool that we can use to create an actual integration mutation tool (see the second sketch below).
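
Returning to the memory-leak question at the top: a sketch of the instrumentation idea in Scala. Since the JVM manages memory, this tracks hand-acquired resources rather than raw malloc/free, but the principle is the same — route every acquire/release through our own code and keep a table of what is still open (all names are made up):

import scala.collection.mutable

// Instrumented "allocator": every acquire/release goes through here, and
// a table records each allocation site that has not been released yet.
object TrackedAlloc {
  private val live = mutable.Map.empty[Long, String]
  private var nextId = 0L

  def acquire(site: String): Long = synchronized {
    nextId += 1
    live(nextId) = site // remember who allocated and has not released
    nextId
  }

  def release(id: Long): Unit = synchronized { live.remove(id) }

  // After a run, anything left in the table is a leak.
  def reportLeaks(): Unit = synchronized {
    if (live.isEmpty) println("no leaks")
    else live.values.foreach(s => println(s"leak: acquired at $s, never released"))
  }
}

object LeakDemo {
  def main(args: Array[String]): Unit = {
    val a = TrackedAlloc.acquire("LeakDemo.main: buffer A")
    TrackedAlloc.acquire("LeakDemo.main: buffer B") // never released -> reported
    TrackedAlloc.release(a)
    TrackedAlloc.reportLeaks()
  }
}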
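
And a sketch of how Scalameta could drive the && to || integration mutation (this assumes Scala 2 with the scalameta library and its quasiquotes; a real tool would produce one mutant per occurrence and run the test suite against each):

import scala.meta._

object MutateDemo {
  def main(args: Array[String]): Unit = {
    val original = """
      object Checks {
        def canPass(score: Int, attended: Boolean): Boolean =
          score >= 50 && attended
      }
    """.parse[Source].get

    // One mutation rule: turn && into || wherever it occurs.
    val mutant = original.transform {
      case q"$a && $b" => q"$a || $b"
    }

    println(mutant.syntax) // the mutated source, ready to compile and test
  }
}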
White-box Testing
White-box testing is when we actively use knowledge about the source code.
• How does static analysis differ from dynamic analysis? 
 ○ Static analysis is what we do before the program is compiled/run. 
 ○ Dynamic analysis is what we do as the program is run/executed.
 • Lint-type analyses: 
 ○ A compiler gives you syntax errors when you make a mistake. It also warns you when something could be wrong even though it is not an error.
 ○ This mainly works for coding conventions, for example analysis of camelCase vs this_case, etc.
 ○ It is customizable with company standards etc. It can be used as a pre-commit test as well.
How do syntactic analyses differ from semantic analyses?   
○ Syntactic analysis uses grammar rules to analyze the syntax:   
 § For example, we have logical expressions without parentheses (see the sketch at the end of this note).
○ Semantic analysis is more about the meaning, for example the types:
§ The Scala compiler has plugins and configuration values to write the semantic information generated by the compiler etc. to a file, so that we can then use and analyze it.
○ Syntactic analysis makes use of what constitutes a correct program, while semantic analysis makes use of all the information the compiler generated when it compiled the program. Semantic analysis needs the program to be compiled; syntactic analysis can just use the grammar rules etc. without compiling it.
• What are ways to implement peer inspection/review on a project?   
○ We don't always need tooling to do white-box testing; we can use peer inspection/review.
○ To implement this, we could use pair programming, where peer inspection/review happens as the code is being created, or code reviews, where you sit people together to inspect the code and explain what was done and why.
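
A small example of the purely syntactic kind of issue a linter can flag (names are made up). No type or runtime information is needed, only the grammar:

object PrecedenceDemo {
  def main(args: Array[String]): Unit = {
    val isAdmin = false
    val isOwner = true
    val isActive = false

    // A lint-type analysis may warn here: && binds tighter than ||, so this
    // parses as isAdmin || (isOwner && isActive), which may or may not be
    // what the author meant. Parentheses would make the intent explicit.
    val canEdit = isAdmin || isOwner && isActive
    println(canEdit) // false
  }
}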
What is the difference between alpha testing and beta testing?
• In beta testing we assume the whole functionality is there; there are still bugs etc., and we know how to fix them, but that requires time.
 • Beta testing is normally done with users who volunteer to receive an early version of the game or software. Alpha testing is a much earlier stage; it is not done by the developers but by a group of people designated for it, though this group is normally not the end users.
Mutation Testing
• Mutation testing makes automatic changes to the source code, then compiles and tests the new source code.
 • There are several mutation testing libraries that you can use:
 ○ One, originally from the JavaScript world, is called Stryker (Stryker4s is its Scala port):
 § You don't just generate arbitrary strings, because then the number of strings you can generate is enormous and most won't pass the compiler. We want to produce syntactically correct source code.
 § One example of rules you can configure is allowing && to be converted into ||.
§ The mutants are copies of the source code with a change applied at an arbitrary place according to the rules; mutation testing kills a mutant when it doesn't pass the tests (see the sketch below).
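
A minimal sketch of what killing a mutant means, with a made-up function. The test passes on the original code; run against the && to || mutant it fails, so that mutant is killed (a surviving mutant would indicate a missing test):

object Access {
  // Original code under test.
  def isAdult(age: Int, hasConsent: Boolean): Boolean =
    age >= 18 && hasConsent

  // What the && -> || mutation rule would produce as a mutant copy.
  def isAdultMutant(age: Int, hasConsent: Boolean): Boolean =
    age >= 18 || hasConsent
}

object KillTheMutant {
  def main(args: Array[String]): Unit = {
    // The killing test: passes on the original (false expected)...
    assert(!Access.isAdult(25, hasConsent = false))
    // ...while the mutant returns true for the same input, so the same
    // assertion would fail on the mutant — it is detected and killed.
    assert(Access.isAdultMutant(25, hasConsent = false))
  }
}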