Schema.getGlobalDescribe() is one of the costliest methods in Apex in terms of CPU time. It returns the schema of the entire org, and most orgs have many standard and custom objects along with objects from AppExchange packages.
We generally use this method to make Apex code dynamic, for example to build a custom Apex tool like Workbench that can query any object and any fields selected by the user.
Most of the time we call this method inside a loop to make sure we find the right object or field in the map.
This puts a lot of load on backend CPUs, and we may hit governor limits or the page may become very slow.
To address this issue we should use Salesforce Platform Cache and the Singleton pattern in Apex.
Here we call Schema.getGlobalDescribe() only once and store the result in the cache.
Whenever we need this data again, we read it from the Platform Cache instead.
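Below is a minimal sketch of the idea, assuming an org cache partition named local.SchemaCache has been created; the class name and cache key are illustrative, not from the original post. SObjectType tokens generally cannot be serialized into Platform Cache, so the sketch keeps the full describe map in a static variable (the Singleton part, at most one describe call per transaction) and stores only the object API names, which are plain strings, in the cache.

// Minimal sketch: Singleton describe map plus Platform Cache for object names.
// Assumes an org cache partition named 'local.SchemaCache' exists.
public class GlobalDescribeCache {

    // Singleton: populated at most once per transaction, reused by every caller.
    private static Map<String, Schema.SObjectType> describeMap;

    public static Map<String, Schema.SObjectType> getDescribe() {
        if (describeMap == null) {
            describeMap = Schema.getGlobalDescribe();
        }
        return describeMap;
    }

    // Object API names are plain strings, so they can live in Platform Cache
    // and be reused across transactions without another describe call.
    public static Set<String> getObjectNames() {
        Cache.OrgPartition partition = Cache.Org.getPartition('local.SchemaCache');
        Set<String> names = (Set<String>) partition.get('objectNames');
        if (names == null) {
            names = getDescribe().keySet();
            partition.put('objectNames', names);
        }
        return names;
    }
}

With something like this in place, a dynamic query builder can ask GlobalDescribeCache.getDescribe() for tokens instead of calling Schema.getGlobalDescribe() repeatedly.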
Here are answers to a few questions we were not able to cover during the session.
From Ybandopantpatil: What is an IDP org and an SP org?
Atul: IDP stands for Identity Provider. The IDP manages the identity of users and their access rights. Some examples of IDPs are Active Directory, Ping Identity, and Salesforce (check the demo again).
SP is the Service Provider. A Service Provider offers some useful service to users but does not manage them; it relies on the IDP to manage users. Once the IDP authenticates a user, the SP trusts the IDP and does not ask for login credentials again.
From Jason: Is each user's Federation ID set manually like you did in the demo, or would it be set via automation in a production setup?
Atul: The Federation ID can be set manually as well as automatically. In the demo we set it manually for simplicity. There is another aspect of provisioning where we create the user on first login; in that case we can set the Federation ID automatically. As Tejas mentioned in his answer, it can also be set via Data Loader (or the API).
From PK: Can you ask him to speak on the various OAuth flows?
Atul: Sure. There are 8 OAuth flows in Salesforce: User Agent, Web Server, JWT, Device, Asset, SAML Bearer, SAML Assertion, and Username-Password.
From SwatiSharma: In a connected app, when do we use the Enable OAuth setting? For the current demo he used Enable SAML.
Atul: Yes, that is right. A Salesforce connected app allows you to work with both SAML and OAuth. The choice is made based on what the SP supports. The majority of IDPs support both SAML and OAuth so that the maximum number of SPs can work with them.
From Narayana: I am a beginner in Salesforce. Can you suggest the best way to learn?
Atul: I think Trailhead is the best way to start. Pick a topic of your interest and just get started with your Trailhead journey.
From bharath kumar: If we enable login with Google, how do we manage it for multiple users, i.e., how do we get the consumer key and secret for every user?
Atul: We need to set up the trust only once between Salesforce and Google. Once that is done, every Google user who is mapped to a Salesforce user can log in to your system. I used a custom field on the User object to store the Google ID, but you can do it as per your need or even use the Federation ID.
From Harshal: If a new user logs in, how is the profile assigned?
Atul: We have a registration handler for this. Every time a user logs in, the registration handler is executed. We can write our business logic in the registration handler and set up users as per our needs, e.g., assign profiles, give permissions, and so on. A sketch of such a handler is shown below.
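The following is a minimal sketch of a registration handler, not the exact class used in the demo; the class name, the 'Standard User' profile, and the field mappings are assumptions for illustration.

// Minimal sketch of an Auth.RegistrationHandler for a Google Auth. Provider.
// Profile name and field mappings below are illustrative assumptions.
global class GoogleRegistrationHandler implements Auth.RegistrationHandler {

    // Called the first time a Google identity logs in and no linked user exists.
    global User createUser(Id portalId, Auth.UserData data) {
        Profile p = [SELECT Id FROM Profile WHERE Name = 'Standard User' LIMIT 1];
        return new User(
            ProfileId = p.Id,
            Username = data.email,
            Email = data.email,
            FirstName = data.firstName,
            LastName = data.lastName,
            Alias = data.lastName.left(8),
            EmailEncodingKey = 'UTF-8',
            LanguageLocaleKey = 'en_US',
            LocaleSidKey = 'en_US',
            TimeZoneSidKey = 'America/Los_Angeles'
        );
    }

    // Called on every subsequent login; keep the Salesforce user in sync.
    global void updateUser(Id userId, Id portalId, Auth.UserData data) {
        User u = new User(Id = userId, Email = data.email);
        update u;
    }
}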
From Saksham Mahendru: Hi. In the case of Google SSO, if a user is deactivated in G Suite, will Social SSO take care of automatically deactivating the Salesforce user as well? Is there any setting for that? Similarly for user creation?
Atul: Once the user is deactivated in the IDP (Google in our case), no application that trusts only the IDP will allow that deactivated user to log in. If you are managing identity at the SP level (this is not standard practice, as it defeats the purpose of SSO), then you will need to take extra effort to sync users between the two systems via APIs.
1. Hard work, not intelligence - Even if someone is a great developer and possesses excellent technical skills, we cannot say for sure whether that person will be able to clear the exam. But if you are ready to work hard, then we can confidently predict that you will be able to clear any Salesforce certification.
2. Read the question properly - Take the effort to read each question and all its options carefully. Many questions are phrased in a complex way, and re-reading them will help you understand them properly. Pay attention to the word NOT, as it changes the meaning of the question.
3. Use the rejection and sorting technique - When you are not sure about an answer, you can use this technique: first eliminate the options that are clearly irrelevant, then rank the remaining ones in the context of the question and select the right option. There is no negative marking, so make sure you answer all questions even if you don't know the answer.
4. Trailhead modules & Salesforce training videos - This is really good training material. You should complete the relevant modules to get a thorough understanding of the topics mentioned in the exam guide.
5. Take notes - When you go through dumps, take notes of the questions where you got wrong answers. Try those activities in your Dev org and think through them several times.
Problem statement: The enterprise needs to load millions of rows from a legacy Salesforce CRM into a new Salesforce CRM. While the IT team continues the data migration, other business units must be able to use the system for day-to-day operations, for example resolving customer issues, responding to queries, running marketing activities, and so on. Salesforce's multitenant architecture, with its variety of APIs (Bulk/SOAP/REST), handles this very well with very few exceptions.
The particular problem I want to highlight is global search. When we bulk load data, it affects global search indexing. Global search indexing works asynchronously but normally feels near real time. However, we have millions of rows to load, and at such high volumes global search becomes unusable because the indexing cannot keep up. In our test it took a whole week for global search to work properly. CSR users were not happy about it, and they breached various SLAs because of this.
Solution:
The approach is to load data as slowly as possible. If you load relatively few records (say 5k or fewer), global search catches up in a matter of seconds (for me, it was less than 30 seconds). So, instead of loading data in bulk, we chose to load it in a "Trickle Load" fashion. In a way it is the exact opposite of bulk data loading: in bulk data loading we upload all the data and periodically check the status, whereas in trickle loading we load a small chunk and wait for Salesforce to finish updating the global search indexes. After some time we submit another chunk, and this goes on until we finish.
If you think about it, this is simply a trade-off between limits.
How to trickle load huge volumes of data?
1. Split the large CSV data file into batches of 5k records (PHP code is provided at the bottom).
2. Trigger a data load job every 5 minutes to insert one batch file.
3. The easiest way for me was to run the Data Loader CLI from the command prompt.
Things to keep in mind:
· With this approach we can load a maximum of 1,440,000 records per 24 hours (5k records * 12 batches per hour * 24 hours).
· As the wait time increases, you will see fewer problems with global search index jamming; as the wait time decreases, the jamming problem gets worse.
· Higher API consumption can result from this approach.
· We chose to wait 5 minutes, but you can pick another interval depending on how much index jamming you are able to tolerate.
I cleared the Salesforce Certified Integration Architecture Designer exam on 9 Sept 2016.
Here are my notes; they may help you prepare and give you some idea of what to expect.
Important Topics:
SOAP API - partner WSDL vs. enterprise WSDL
REST API
Outbound messaging, with all its limitations, pros, and cons
There was a question about email services, which was not mentioned in the study guide
Data Loader
ETL tools - pros and cons
Governor limits
Security
Other APIs - Metadata, Chatter, Streaming, Bulk
Other notes
Almost 70% of the questions were scenario based
It took me 89 minutes to complete the exam, and I did not get any time for review
Knowing the point above, I followed the "do it once and do it right" principle
There was only one question on Enterprise Integration Patterns (the book), and to be very frank, I only read its title, as it is too much to read. If you are like me, maybe you can just brush through it and focus on the rest of the important topics
Many times you will feel that all the options are correct
Follow the elimination technique to narrow down your choices
Please don't ask for sample questions. If you have any other questions, please feel free to ask in the comments and I will get back to you.