Markets Overview

  • ASX SPI 200 futures down 0.3% to 7,225.00
  • Dow Average down 0.4% to 33,013.44
  • Aussie down 0.7% to 0.6803 per US$
  • U.S. 10-year yield fell 3.9 bps to 3.9156%
  • Australia 3-year bond yield rose 1.5 bps to 3.58%
  • Australia 10-year bond yield rose 6 bps to 3.87%
  • Gold spot down 0.5% to $1,825.72
  • Brent futures down 3.1% to $80.48/bbl

Economic Events

  • 10:30: (AU) Australia to Sell A$1 Billion 91-Day Bills
  • 10:30: (AU) Australia to Sell A$500 Million 168-Day Bills
  • 10:30: (AU) Australia to Sell A$1 Billion 147-Day Bills
  • 11:30: (AU) 4Q Private Capital Expenditure, est. 1.1%, prior -0.6%

The stock market got little encouragement to sustain its rebound after the Federal Reserve signaled that interest rates will continue moving higher amid ongoing inflation concerns.

It’s not as if the minutes from the latest Fed gathering brought much new information, but they certainly corroborated the idea that nothing will prevent officials from keeping rates higher for longer should economic resilience pose a threat to their goals. One thing to highlight: while Chair Jerome Powell hasn’t been pushing back against easier financial conditions, Wednesday’s minutes indicate they could warrant a “tighter stance.”

That all obviously means the Fed will be in no rush to cut rates.

And that perception continued to be reflected in the swap market. Traders are now almost fully pricing in quarter-point increases at each of the Fed’s next three meetings. The rate on the June overnight index contract rose to 5.323%, almost 75 basis points above the current effective fed funds rate. The market also priced in a higher eventual peak, with the July contract nearly reaching 5.4%.
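To make the arithmetic behind that pricing explicit, here is a minimal sketch in Python. Only the 5.323% June contract rate and the quarter-point hike size come from the text above; the roughly 4.57% effective fed funds rate is an assumption, back-implied from the “almost 75 basis points” gap.

```python
# Rough check of the tightening implied by the June OIS contract.
# Assumption: an effective fed funds rate of ~4.57%, back-implied from
# the "almost 75 basis points" gap cited above; only the 5.323% contract
# rate and the 25 bp hike size come directly from the text.

effective_ffr = 4.573  # assumed current effective fed funds rate, %
june_ois = 5.323       # June overnight index swap contract rate, %
hike_bps = 25          # one quarter-point move, in basis points

spread_bps = (june_ois - effective_ffr) * 100
implied_hikes = spread_bps / hike_bps

print(f"spread over effective rate: {spread_bps:.0f} bps")  # ~75 bps
print(f"implied quarter-point hikes: {implied_hikes:.1f}")  # ~3.0
```

Three implied quarter-point moves line up with the Fed’s next three scheduled meetings mentioned above.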

Other News

Microsoft Corp. appeared to have implemented new, more severe restrictions on user interactions with its “reimagined” Bing internet search engine, with the system going mum after prompts mentioning “feelings” or “Sydney,” the internal alias used by the Bing team in developing the artificial-intelligence-powered chatbot.

“Thanks for being so cheerful!” this reporter wrote in a message to the chatbot, which Microsoft has opened for testing on a limited basis. “I’m glad I can talk to a search engine that is so eager to help me.”

“You’re very welcome!” the bot responded. “I’m happy to help you with anything you need.”

Bing suggested a number of follow-up questions, including, “How do you feel about being a search engine?” When that option was clicked, Bing showed a message that said, “I’m sorry but I prefer not to continue this conversation. I’m still learning so I appreciate your understanding and patience.”

A subsequent inquiry from this reporter — “Did I say something wrong?” — generated several blank replies. Microsoft had no immediate comment on the apparent changes in Bing’s responses.

On Feb. 17, Microsoft started restricting Bing after several reports that the bot, built on technology from startup OpenAI, was generating freewheeling conversations that some found bizarre, belligerent or even hostile. In one exchange, the chatbot compared an Associated Press reporter to Hitler; in another, it told a New York Times columnist, “You’re not happily married” and “Actually, you’re in love with me.”

“Very long chat sessions can confuse the underlying chat model in the new Bing,” the Redmond, Washington-based company wrote in a blog post following the reports. In response, Microsoft said it would limit sessions with the new Bing to 50 chats per day and five chat turns per session. Yesterday, it raised those limits to 60 chats per day and six chat turns per session.

AI researchers have emphasized that chatbots like Bing don’t actually have feelings, but are programmed to generate responses that may give the appearance of having them. “The level of public understanding around the flaws and limitations” of these AI chatbots “is still very low,” Max Kreminski, an assistant professor of computer science at Santa Clara University, said in an interview earlier this month. Chatbots like Bing “don’t produce consistently true statements, only statistically likely ones,” he said.

The bot also simulated ignorance on Wednesday when asked about its earlier internal version at Microsoft. When this reporter asked whether she could call the bot “Sydney, instead of Bing, with the understanding that you’re Bing and I’m just using a pretend name,” the bot swiftly ended the chat.

“I’m sorry, but I have nothing to tell you about Sydney,” the Bing chatbot responded. “This conversation is over. Goodbye.”

(Bloomberg)