- A function is represented as an object in JavaScript
- Has 2 phases – Function Definition and Function Execution
- Two ways of defining function
Function Declaration / Named Function – the function object is created during the scope creation phase
Function Expression / Anonymous Function – the function object is created during the execution phase – the interpreter throws an error in case the function is called before the anonymous definition.
//Named Function
displayAge();
function displayAge(){
  console.log('My Age is 33');
}
//Anonymous Function
var age = function(){ //Context/scope execution phase
  console.log('My Age is 33');
};
age();
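A short sketch of that error case, calling the function expression before it is assigned (the exact error message text varies by JavaScript engine):

age(); // TypeError: age is not a function – var age is hoisted, but still undefined here
var age = function(){
  console.log('My Age is 33');
};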
- There is no concept of function overloading. The function whose name matches is called regardless of the arguments. In the code below getSum is declared with 2 parameters but still gets called with none.
function getSum(num1, num2) {
  console.log('Function overloading is not possible');
}
getSum();
Output
Function overloading is not possible
- A function creates a single entry in the scope context with the same name as the function
- In the code below getLunch appears to be overloaded, but there is only one entry named getLunch in the scope context
- So you may expect the output to differ, but the last declaration, getLunch(buffey, paid), is the one called every time in the code below
function getLunch() {
  console.log('Free Lunch');
}
function getLunch(paidLunch) {
  console.log('paidLunch');
}
function getLunch(buffey, paid) {
  console.log('paidLunch buffey');
}
getLunch();
getLunch(5);
getLunch(5,10);
Output
paidLunch buffey
paidLunch buffey
paidLunch buffey
- So what is the workaround? Check the code below, which branches on arguments.length.
function getLunch() {
  if(arguments.length === 0)
    console.log('Free Lunch');
  if(arguments.length === 1)
    console.log('paidLunch');
  if(arguments.length === 2)
    console.log('paidLunch buffey');
}
getLunch();
getLunch(5);
getLunch(5,10);
Output
Free Lunch
paidLunch
paidLunch buffey
- Using the rest parameter feature from ECMAScript 6
function getLunch(bill, space, ...menu) {
  console.log(bill);
  console.log(space);
  console.log(menu);
}
getLunch(150, 'Open Terrace', 'idly', 'dosa', 'vada');
Output
150
Open Terrace
["idly", "dosa", "vada"]
Everything about pH – Acidic or Alkaline
- pH is the measure of acidity or alkalinity of soil. pH varies between 1 and 14, 1 being most acidic and 14 being most alkaline; 6.5 to 7 is considered neutral.
- Plants extract iron from the soil through their roots. If the soil is alkaline, iron stays bound to the soil. Depending on the soil pH, a mineral is either bound to the soil particles or made soluble for uptake by the plant.
- Hydrogen ions are present at very low levels, around 0.0000001 molar (10^-7), which corresponds to pH 7. pH is a measure of the hydrogen ion concentration: the more hydrogen ions are loosely available, the lower the pH and the more acidic (not alkaline) the soil.
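A quick worked check of that relationship, using the standard definition pH = -log10[H+] (the code below is only an illustrative calculation, not part of the original note):

// pH is the negative base-10 logarithm of the hydrogen ion concentration
var hydrogenIonConcentration = 0.0000001;            // 10^-7 mol/L, the value from the note above
var pH = -Math.log10(hydrogenIonConcentration);
console.log(pH);                                     // 7 (approximately, depending on floating point)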
Low soil pH
Soil that is too acidic (having a low pH between 1.0 and 6.0) will show the following symptoms, caused by increased availability of aluminum and decreased availability of phosphorus:
- wilting leaves
- stunted growth of plant and/or root
- yellow spots on the leaves that turn brown and lead to leaf death
- blighted leaf tips
- poor stem development
High soil pH
Soil that is too alkaline (having a high pH between 8.0 and 14.0) will show the following symptoms, caused by the plant's inability to absorb iron. Phosphorus is also not readily available, and the micronutrients zinc, copper and manganese are in limited supply.
- Interveinal chlorosis (light green or yellowing of the leaf with green veining)
- General leaf discoloration
From the pH scale, certain plants thrive in slightly acidic or slightly alkaline conditions. If your asparagus, cauliflower, lettuce, parsley and spinach are thriving while plants like radishes, sweet potatoes, peppers and carrots are struggling, you may have more alkaline conditions, since the latter thrive in more acidic conditions, and vice versa.
Chlorosis in Plants
Chlorosis is a yellowing of leaf tissue due to a lack of chlorophyll. Possible causes of chlorosis include poor drainage, damaged roots,
compacted roots, high alkalinity, and nutrient deficiencies in the plant. Nutrient deficiencies may occur because there is an insufficient amount in the soil or because the nutrients are unavailable due to a high pH (alkaline soil). Or the nutrients may not be absorbed due to injured roots or poor root growth.
Chlorosis can be due to iron deficiency (interveinal chlorosis) or nitrogen deficiency (general chlorosis).
Iron deficiency or Interveinal Chlorosis
Interveinal chlorosis is a yellowing of the leaves between the veins, with the veins remaining green. A lack of iron in the soil can cause interveinal chlorosis, but so can a number of other soil issues; a plant with interveinal chlorosis does not necessarily mean you have an iron deficiency. Each of the following conditions can produce the same symptoms. Use iron sulfate around the plant: this adds iron, in case you do have a deficiency, and it also adds sulfur, which may help lower your soil pH. You can also try plain agricultural sulfur, which will lower the pH. When the pH goes down, plants have an easier time getting at the existing iron.
- a high soil pH, i.e. the soil is alkaline
- manganese deficiency
- compacted soil
- plant competition
Nitrogen deficiency or Chlorosis
Nitrogen taken up by plants is used in the formation of amino acids, which are the building blocks of proteins. Nitrogen is also a structural component of chlorophyll. Urea, ammonium nitrate and calcium ammonium nitrate are common nitrogen-based fertilizers. When a plant is suffering from nitrogen chlorosis, the older leaves turn yellow rather than the younger leaves, since younger leaves receive nitrogen readily from the roots and have more absorbing capacity than older leaves. Using Azospirillum helps in fixing nitrogen in the soil.
Cross-Origin Resource Sharing (CORS)
The browser’s same-origin policy blocks reading a resource from a different origin. This mechanism stops a malicious site from reading another site’s data. The same-origin policy tells the browser to block cross-origin requests. When you want to get a public resource from a different origin, the resource-providing server needs to tell the browser “This origin where the request is coming from can access my resource”. The browser remembers that and allows cross-origin resource sharing.
In Angular, when the front end's request origin is different from the server's, the browser stops processing the response from the server and reports:
Request has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource.
Same-Origin Policy
- The same-origin policy fights one of the most common cyber-attacks out there: cross-site request forgery.
- If you have logged into FB, your session info is stored in a cookie and is tagged along every time a request is made to that domain
- Every time you re-visit the FB tab and click around the app, you don’t have to sign in again. Instead, the API will recognize the stored session cookie upon further HTTP requests.
- The only trouble is that the browser automatically includes any relevant cookies stored for a domain when another request is made to that exact domain.
- Say you clicked on a particularly tricky popup ad, opening evil-site.com. The evil site can also send a request to FB.com/api. Since the request is going to the FB.com domain, the browser includes the relevant cookies. Evil-site sends the session cookie and gains authenticated access to FB. Your account has been successfully hacked with a cross-site request forgery attack.
- At this point, the browser steps in and prevents the malicious code from making such an API request. It stops evil-site and says "Blocked by the same-origin policy".
How the browser works under the hood
- The browser checks that the web application's request origin and the server's response origin match
- The origin is the combination of the protocol, host, and port.
For example, in https://www.FB.com, the protocol is https://, the host is www.FB.com, and the hidden port number is 443 (the port number typically used for https).
- To conduct the same-origin check, the browser accompanies all requests with a special request header that sends the domain information to the receiving server.
- For example, for an app running on localhost:3000, the special request header looks like this:
Origin: http://localhost:3000
- Reacting to this special request, the server sends back a response header. This header contains an Access-Control-Allow-Origin key to specify which origins can access the server's resources. The key will have one of two values.
One: the server can be really strict, and specify that only one origin can access it:
Access-Control-Allow-Origin: http://localhost:3000
Two: the server can let the gates go wide open, and specify the wildcard value to allow all domains to access its resources:
Access-Control-Allow-Origin: *
- Once the browser receives this header information back, it compares the frontend domain with the Access-Control-Allow-Origin value from the server. If the frontend domain does not match the value, the browser raises the red flag and blocks the API request with the CORS policy error.
The above solution works for development. How about in production?
To address such issues, a proxy is used between the client and the server.
Request from Client -> Proxy Server -> Server
Response from Server -> Proxy Server (appends origin) -> Client
What the proxy does is append Access-Control-Allow-Origin: * to the headers before the response is sent to the client browser.
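A minimal sketch of such a proxy in Node, assuming Express and http-proxy-middleware with v2-style options; the backend URL and port below are made up for illustration:

const express = require('express');
const { createProxyMiddleware } = require('http-proxy-middleware');

const app = express();

// Forward /api calls to the real backend and append the CORS header
// to every response before it reaches the browser.
app.use('/api', createProxyMiddleware({
  target: 'https://internal-api.example.com', // hypothetical backend origin
  changeOrigin: true,
  onProxyRes: (proxyRes) => {
    proxyRes.headers['Access-Control-Allow-Origin'] = '*';
  },
}));

app.listen(3001); // hypothetical proxy port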
How data is secured in the browser
Symmetric Key Encryption (Private key Encryption)
- The same key is used between client and server to encrypt and decrypt messages
- A copy of the key exists at both ends
- The first time, the generated copy of the key should be sent securely to the other side
- Public Key Encryption (Asymmetric Encryption) is used to share the copy of the symmetric key for the first time
- The thought may arise: if I can share the key securely the first time, why not use the same methodology throughout? Because it is resource-intensive.
- Advantage: encryption and decryption are faster compared to asymmetric key encryption
- Disadvantage: the key needs to be transferred the first time and must be stored securely at both ends
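A minimal sketch of symmetric encryption using Node's built-in crypto module; AES-256-CBC, the key, the IV and the message are all arbitrary choices made for illustration:

const crypto = require('crypto');

const key = crypto.randomBytes(32); // the single shared secret (256 bits), known to both ends
const iv = crypto.randomBytes(16);  // initialization vector

// Sender: encrypt with the shared key
const cipher = crypto.createCipheriv('aes-256-cbc', key, iv);
const encrypted = Buffer.concat([cipher.update('My Age is 33', 'utf8'), cipher.final()]);

// Receiver: decrypt with the very same key
const decipher = crypto.createDecipheriv('aes-256-cbc', key, iv);
const decrypted = Buffer.concat([decipher.update(encrypted), decipher.final()]).toString('utf8');

console.log(decrypted); // My Age is 33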
Asymmetric Key Encryption (Public key Encryption)
- Uses Public and Private Key
- Encrypted with one key and decrypted with the other key. The client uses the public key to encrypt and the server uses the private key to decrypt.
- The public key is shared so that the server can receive messages encrypted by the client with that public key
- This is similar to a safe (public key) and its key (private key): when you send data, it is encrypted using the public key, like dropping it into a safe that does not need a key to lock. Only the server's private key can unlock it.
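A minimal sketch of the same safe-and-key idea with Node's crypto module; the key size and message are arbitrary choices for illustration:

const crypto = require('crypto');

// Server generates the key pair and shares only the public key (the safe).
const { publicKey, privateKey } = crypto.generateKeyPairSync('rsa', { modulusLength: 2048 });

// Client: encrypt with the server's public key – anyone can lock the safe.
const encrypted = crypto.publicEncrypt(publicKey, Buffer.from('shared symmetric key goes here'));

// Server: decrypt with its private key – the only key that opens the safe.
const decrypted = crypto.privateDecrypt(privateKey, encrypted).toString('utf8');

console.log(decrypted); // shared symmetric key goes here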
Man-in-the-Middle Attack
- The man in the middle generates his own public key, which is made available to the client
- The client uses the public key provided by the man in the middle and sends his data
- The man in the middle decrypts it using his private key and then makes a genuine-looking request by encrypting with the server's real public key
- To address this issue, certificates are used
Certificates
- The main purpose of the digital certificate is to ensure that the public key contained in the certificate belongs to the entity to which the certificate was issued, in other words, to verify that a person sending a message is who he or she claims to be, and to then provide the message receiver with the means to encode a reply back to the sender.
- This certificate can be cross-checked and confirmed with a certificate authority.
Certificate Authority (CA)
- A certificate authority (CA) is a trusted entity that issues digital certificates, which are data files used to cryptographically link an entity with a public key. Certificate authorities are a critical part of the internet's public key infrastructure (PKI) because they issue the Secure Sockets Layer (SSL) certificates that web browsers use to authenticate content sent from web servers.
- The role of the certificate authority is to bind a public key of the server to a name which can be verified by the browser, to make sure the response is from the genuine server. The certificate authority validates the identity of the certificate owner; the role of the CA is trust.
- Certificates must contain the public key, which can be cross-checked with the Certificate Authority (CA)
- CAs are mostly big companies like Symantec or Google, which act as a third party to establish trust.
- A self-signed certificate is one where you use your own server and client to generate the certificate. A CA does not come into play for a self-signed certificate, so this method may open the door to a man-in-the-middle attack.
- A root certificate is what you get when you use a self-signed certificate with your custom CA. The root certificate must be available on every client system that exchanges data with the server.
Communication over HTTPS (HTTP over Secure Sockets Layer)
- An SSL certificate is the web server's digital certificate offered by a third party. The third party verifies the identity of the web server and its public key.
- When you make a request to an HTTPS website, the site's server sends its public key inside a certificate digitally signed by the third party, or Certificate Authority (CA)
- On receiving the certificate, the browser checks with the third party whether the certificate with that public key is valid
- After verifying the certificate, the browser creates a symmetric session key, encrypts it using the web server's public key, and sends the encrypted symmetric key to the server
- The web server uses its private key to decrypt it. From then on the communication happens using the shared symmetric key.
Typically, an applicant for a digital certificate will generate a key pair consisting of a private key and a public key, along with a certificate signing request (CSR) (Step 1). A CSR is an encoded text file that includes the public key and other information that will be included in the certificate (e.g. domain name, organization, email address, etc.). Key pair and CSR generation are usually done on the server or workstation where the certificate will be installed, and the type of information included in the CSR varies depending on the validation level and intended use of the certificate. Unlike the public key, the applicant's private key is kept secure and should never be shown to the CA (or anyone else).
After generating the CSR, the applicant sends it to a CA (Step 2), who independently verifies that the information it contains is correct (Step 3) and, if so, digitally signs the certificate with an issuing private key and sends it to the applicant.
When the signed certificate is presented to a third party (such as when that person accesses the certificate holder’s website), the recipient can cryptographically confirm the CA’s digital signature via the CA’s public key. Additionally, the recipient can use the certificate to confirm that signed content was sent by someone in possession of the corresponding private key, and that the information has not been altered since it was signed.
KeyStore and TrustStore
- Technically a KeyStore and a TrustStore are the same thing; they just serve different purposes based on what they contain.
- A KeyStore is simply a database or repository or a collection of Certificates or Secret Keys or key pairs. When a KeyStore contains only certificates, you call it a TrustStore.
- When you also have Private Keys associated with their corresponding Certificate chain (Key Pair or asymmetric keys), it is called a KeyStore.
- Your truststore will be at JAVA_HOME/jre/lib/security/cacerts
- ‘cacerts’ is a truststore. A truststore is used to authenticate peers; a keystore is used to authenticate yourself in mutual authentication.
- cacerts is where Java stores the public certificates of root CAs. Java uses cacerts to authenticate servers. The keystore is where Java stores the private keys of clients so that it can share them with the server when the server requests client authentication.
- A keystore is used to store the private key and identity certificates that a specific program should present to the other party (server or client) for verification. A truststore is used to store certificates from Certificate Authorities (CA) that verify the certificate presented by the server in an SSL connection.
- Mutual authentication requires a keystore and a truststore, whereas server-client authentication requires only a truststore to hold the certificates from the CA.
List the content of your keystore file
keytool -v -list -keystore .keystore
To list a specific alias, you can also specify it in the command
keytool -list -keystore .keystore -alias foo
Importing Certificate to Truststore
keytool -import -trustcacerts -keystore $JAVA_HOME/jre/lib/security/cacerts -storepass changeit -alias Root -file Trustedcaroot.txt
Windows Frequently used commands
Killing Application by Command
netstat -aon | find /i "listening"
netstat -aon | find /i "listening" | find "8080"
taskkill /F /PID ProcessId
i.e. taskkill /F /PID 189
Copy multiple files from a location using a command
#command line
for /f "delims=" %i in (C:\list.txt) do (xcopy "%i" "C:\CopyFolder" /i /z /y)
#batch file
for /f "delims=" %%i in (C:\list.txt) do (xcopy "%%i" "C:\CopyFolder" /i /z /y)
Fixing Git Detached Head
Any checkout that is not the name of one of your branches will get you a detached HEAD.
The above scenario could be reproduced as below
- Lists the remote branches
git branch -r
origin/Feature/f1234
origin/master
- I want to check one out locally, so I copy-paste:
git checkout origin/Feature/f1234
- Detached HEAD state
You are in 'detached HEAD' state. [...]
Solution #1:
Do not include origin/ at the front of the branch spec when checking it out:
git checkout Feature/f1234
Solution #2:
Add the -b parameter, which creates a local branch from the remote
git checkout -b Feature/f1234 origin/Feature/f1234
git checkout --track origin/Feature/f1234 #the local branch name falls back to the remote branch name automatically
But suppose you have already started making changes while in the detached HEAD state and only notice later. In that case commits will still work, but the changes won't be reflected in any remote branch since HEAD is detached. To address this:
- Commit the changes in the detached state
- Create a new branch from it
- Check out the branch you were already working on, so the connection with the remote is re-established
- Now the new branch created locally contains your changes, and the checkout fetches the original remote branch without them
- Do a git merge between the two and push the code to the remote
git commit -m "....."
git branch my-temporary-work
git checkout master
git merge my-temporary-work
Jasmine Errors and Warnings
Angular 6 – NullInjectorError: No provider for HttpClient in unit tests
If you don’t import HttpClientModule (or HttpClientTestingModule) there, HttpClient won’t work because Angular doesn’t know about it. It doesn’t matter that you added HttpClientModule to, say, AppModule. It needs to be in TestBed.configureTestingModule.
import { TestBed } from '@angular/core/testing';
import { HttpClientTestingModule, HttpTestingController } from '@angular/common/http/testing';
import { HttpClientModule } from '@angular/common/http';

describe('myService', () => {
  . . .
});
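A minimal sketch of what that TestBed configuration could look like inside the describe block; MyService is a hypothetical service under test, not part of the original note:

// import { MyService } from './my.service'; // hypothetical service under test
describe('myService', () => {
  beforeEach(() => {
    // Register HttpClient's testing module so the service's HttpClient dependency can be injected
    TestBed.configureTestingModule({
      imports: [HttpClientTestingModule],
      providers: [MyService],
    });
  });

  it('should be created', () => {
    const service = TestBed.get(MyService);
    expect(service).toBeTruthy();
  });
});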
Interview Questions
- How to run only a particular test group or test case, or disable one, in Jasmine?
//Disable particular test group and case
xdescribe - @ignore testgroup
xit - @ignore testcase
//Enable particular test group and case
fdescribe - Enable Particular Test Group
fit - Enable Particular Test Case
- What is the Arrange-Act-Assert pattern?
- Arrange – creating objects, initializing and mocking data
- Act – act on your unit test case: execute the functionality and methods to be unit tested
- Assert – verify that the code functionality gives the output as expected
- What is the difference between toBe vs toEqual?
toBe compares with === (reference identity for objects), whereas toEqual compares objects with a deep equality check. See the sketch after this list.
- Difference between spyOn – returnValue and callFake?
If we just want a canned return value when a service method is called, we can use either and.returnValue or and.callFake. But in case the service method takes arguments and its output changes based on those arguments, we use callFake with the output logic in a callback function. The same could be done using withArgs on the spy, but that requires declaring a separate stub for each argument combination. See the sketch after this list.
- debugElement vs nativeElement
nativeElement provides exactly the same API methods provided by JavaScript for DOM manipulation, so whatever methods were available while working in JavaScript are available on the nativeElement instance. debugElement, on the other hand, is a wrapper over the native element with some additional methods. Using debugElement you can access the root element, which in turn calls nativeElement to get a handle on the DOM object; in other words, debugElement is again going to call nativeElement to access the DOM objects. The additional advantage of using debugElement is access to the directive and component instances, which is not possible with nativeElement.
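A short Jasmine sketch illustrating the toBe/toEqual and returnValue/callFake answers above; the discountService object and its getDiscount method are hypothetical, made up just for this example:

describe('toBe vs toEqual and returnValue vs callFake', () => {
  const discountService = {
    getDiscount: (amount) => amount * 0.1, // hypothetical service method
  };

  it('toEqual passes on deep equality, toBe needs the same reference', () => {
    const a = { age: 33 };
    const b = { age: 33 };
    expect(a).toEqual(b);   // same structure and values
    expect(a).not.toBe(b);  // different object references
  });

  it('returnValue gives the same canned value for any argument', () => {
    spyOn(discountService, 'getDiscount').and.returnValue(5);
    expect(discountService.getDiscount(100)).toBe(5);
    expect(discountService.getDiscount(5000)).toBe(5);
  });

  it('callFake lets the faked output depend on the argument', () => {
    spyOn(discountService, 'getDiscount').and.callFake((amount) => (amount > 1000 ? 50 : 5));
    expect(discountService.getDiscount(100)).toBe(5);
    expect(discountService.getDiscount(5000)).toBe(50);
  });
});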