Salesforce, a global player in the customer relationship management (CRM) market, has, like many other tech majors, made various announcements around Generative AI. Arundhati Bhattacharya, Chairperson and Chief Executive Officer of Salesforce India, talked to businessline about the growth potential the company sees in Generative AI offerings. Edited excerpts:
How does Salesforce view Generative AI offerings? Given the various announcements, do you think the new tech is more than just hype?
We all think Generative AI has immense possibilities, and these possibilities will become even better as newer versions of Generative AI come in. I do understand that there has been a lot of talk about other solutions, where the hype was far ahead of the actual solutions in the market. However, the general feeling is that that’s not going to happen with Generative AI. GenAI will have a lot of applications, and those applications will be feasible sooner rather than later.
AI has been around for a while now, but what differentiates generative AI is the use of large language models, which enable one to get answers by giving prompts. The ability to extract data points from any mass of written matter is far greater than it was in earlier artificial intelligence models.
How is Salesforce addressing enterprises’ concerns about the use of proprietary data, its privacy, and security?
All of our GPT-infused products will have a trust layer that ensures the data you use remains yours and that no language model is allowed to train on it. There is technology by which you can access a large language model but not allow it to use your data. We will definitely be ensuring that is how it is done, because one of the biggest challenges enterprises face is that they cannot freely utilise customers’ data; they have to make sure it is used responsibly and isn’t exploited, and they do not wish to be held liable for any misuse.
There are three questions that enterprises normally ask: how much can I trust that my data won’t be misused, what will the pricing be like, and what could the liabilities be? We are conscious of the fact that these questions are out there. We don’t have all the answers, but in respect of trust, privacy, and secrecy, we are very alive to the responsibility we have. When our products go live, we will ensure that sufficient security and firewalls are provided so that you are not losing your data to anybody else.
How do you think Generative AI will affect job roles?
Every time there has been a disruptive change, certain types of jobs have been lost, but other types of jobs have been created. Having said that, I don’t think you can replace emotional intelligence with any kind of machine. I find it very difficult to say that human emotional intelligence is going to be replaced by something that is not sentient.
What are the newer skilling initiatives that will be introduced?
We will be starting AI skilling initiatives; the material is getting ready, and we should have them soon. A lot of skilling initiatives will be introduced, initially for our employees, then for the ecosystem of partners and customers. We have a learning platform called Trailhead, which is free to a large extent, and we will be putting all of that on Trailhead for the ecosystem at large to get trained on.