

Global 6G Technology Conference: 2024 "6G + Future TV" White Paper on Video Application Scenario Requirements and Technology Analysis (English Version) (38 pages).pdf



Contents

Preface
1. Development Characteristics and Technical Requirements of New Video Applications and New Terminals with the Evolution of 5G to 6G
1.1 Development Trends of 3D Technology and Application
1.2 Development Trends of VR/XR Technology and Application
2. Characteristics and Technical Requirements of Different Terminal Application Scenarios of Future TV
2.1 The Concept and Development Trends of Future TV
2.2 Diversified Presentation Methods and Characteristics of Production of Immersive Audiovisual Experience
2.3 Development of 6G+UHD-Related Application Scenarios
3. Technical Features of Immersive 8K VR Live Streaming Process
3.1 Overall Framework of 8K VR Live Streaming System
3.2 VR Content Collection and Stitching
3.3 VR Video Encoding
3.4 VR Content Delivery
3.5 VR Streaming Media Transmission
3.6 Cloud VR
3.7 Technical Features of 8K VR Live Streaming System
4. Future TV's Demand for 6G Mobile Networks
5. Conclusion
Participants

Preface

Since China entered the 5G era, the combination of 5G with ultra high definition (UHD) technology and applications has become the fastest-growing scenario among 5G-related applications. With the emergence of a large amount of high-quality UHD content, the user experience requirements of 5G mobile terminals are being met. In particular, with 5G's ultra-high bandwidth and ultra-low latency, 4K and 8K live broadcasts provide the production domain with a more efficient and flexible large-bandwidth

wireless transmission solution. In addition, as the core supporting technology for the expansion of 5G, UHD technology enables cross-industry digital transformation, offering more diverse digital solutions for industry, medical care, military, education, and other fields.

With the further development of mobile communications technology, the concept of 6G, the next generation of mobile communication technology, has been launched in response to demand for more application scenarios. The iterative upgrading of technologies offering ultra-high bandwidth and ultra-low latency, together with the development of UHD 8K and immersive interactive video applications, has promoted the rapid development of new terminal experiences such as VR/XR and 3D stereoscopic display. Users will be able to fully enjoy the new audiovisual and sensory experiences brought by new technologies.

Since 2023, the National Radio and Television Administration and other ministries and commissions have been promoting China's development plan for future-oriented TV. The concept of TV will no longer be limited to the single platform of traditional flat-screen TVs. Outdoor large screens, mobile Internet terminals such as mobile phones and pads, interactive head-mounted display terminals such as VR/AR glasses, the new generation of 3D stereoscopic display terminals (including 3D LED screens, and mobile phones and pads using naked-eye 3D technology), and even new video terminals and media still under development will all become dissemination media of TV content under the concept of future TV.

Since each video terminal and medium uses different presentation methods and is used in different scenarios, their requirements on technical specifications also vary. However, the viewers'

desire for a higher-quality audiovisual experience as technology improves will not change. New audiovisual technologies such as 4K/8K, HDR, and 3D sound, larger flat-screen terminals, and interactive terminals with an ultra-large viewing angle such as VR all require 8K resolution, high frame rate (HFR), and greater real-time transmission capabilities. Moreover, when switching video content between different terminals (such as from mobile phones to large screens and to head-mounted displays), and when converting the projection of content from different providers (radio and television operators, communications operators, and personal media), smooth projection conversion, that is, seamless, highly synchronized, zero-delay signal switching and connection, is also crucial to the end-user experience.

This document mainly provides suggestions and references for the characteristics, technical requirements, and recommended solutions of multiple live broadcast applications in the scenarios involved in future TV, as well as for how to integrate more closely with 6G technology and application scenarios to achieve more innovative

live broadcast content, experience, and cross-industry application solutions.

1. Development Characteristics and Technical Requirements of New Video Applications and New Terminals with the Evolution of 5G to 6G

Users' demand for diverse applications drives the continuous upgrade of network technology. Since the concept of 6G was proposed in the communications field, 6G standards, technologies, and applications have developed rapidly, and the iterative upgrade of technology has in turn accelerated application innovation. Just as 3G promoted the development of social media such as Weibo and Twitter, and 4G facilitated application scenarios such as WeChat and e-commerce, 5G brings high-definition mobile short-form video and interactive live streaming experiences, and accelerates the digital transformation of vertical industries. Now 6G is on the way. 6G will use more frequency bands, such as the upper mid-frequency band, to

further increase network transmission speed. It will need to realize network value and empower thousands of industries through new capabilities and new application scenarios, and accelerate the positive cycle of industry and business through application innovation.

The peak transmission rate of the 6G network will reach over 100 Gbps, much greater than the 10 Gbps of the 5G network. The theoretical communication delay of the 6G network will be only 0.1 milliseconds, one-tenth of that of the 5G network. Such specifications can already meet the technical requirements of wired uncompressed transmission even in professional UHD 4K/8K broadcast and television production, so the 6G network can, in theory, significantly reduce the encoding and decoding delay caused by the deep video compression used in the past. The actual total delay (link delay plus processing delay) will therefore be greatly reduced, fundamentally solving the inconvenience that wireless transmission delay causes for professional production, and opening up possibilities for more video application innovations.

As 5G evolves towards 6G, what unique application innovations will it bring to consumers? When will

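The rate comparison above can be sanity-checked with a few lines of arithmetic. This is a rough sketch, assuming the 7680×4320, 50P, 10-bit, 4:2:2 broadcast format cited later in this paper; the two-samples-per-pixel figure for 4:2:2 is our simplification:

```python
# Rough check: does an uncompressed 8K broadcast signal fit within the
# peak rates quoted above for 5G (10 Gbps) and 6G (100 Gbps)?
WIDTH, HEIGHT = 7680, 4320   # 8K UHD resolution
FPS = 50                     # frames per second (50P)
BITS_PER_SAMPLE = 10         # 10-bit quantization
SAMPLES_PER_PIXEL = 2        # 4:2:2: one luma + one chroma sample per pixel

bits_per_second = WIDTH * HEIGHT * FPS * BITS_PER_SAMPLE * SAMPLES_PER_PIXEL
gbps = bits_per_second / 1e9
print(f"Uncompressed 8K 50P 10-bit 4:2:2: {gbps:.1f} Gbps")
# About 33.2 Gbps: beyond 5G's 10 Gbps peak, but well inside 6G's 100 Gbps.
```

On this arithmetic, even a single uncompressed 8K camera feed exceeds the 5G peak rate, which is why deep compression is unavoidable today, while the quoted 6G peak leaves headroom for it.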
these application innovations be implemented for commercial use on a large scale? For now, regarding video application scenarios, we believe that in the 6G stage, based on the existing combination of 5G + 4K/8K UHD technology, explosive growth opportunities will emerge in the application of naked-eye 3D technology and in scenarios such as VR/XR, with the continuous upgrade of new terminals.

1.1 Development Trends of 3D Technology and Application

With the development of 5G technology, consumer services are showing new trends: moving to the cloud, becoming three-dimensional, and incorporating artificial intelligence. Compared

with traditional 2D video, 3D video has gone through a long development process. Its market share has been limited by the complexity and cost of production, the technical limitations of terminals (including cinemas and head-mounted terminals), and poor experiences, so it has been unable to replace 2D and become mainstream.

However, viewers' love for the 3D experience and willingness to pay in 3D application scenarios have never diminished. The box office and popularity of 3D experiences and 3D films in theme parks (Disneyland, Universal Studios, etc.) and cinemas (especially for science fiction films, animations, etc.) fully prove that 3D technology and applications have sufficient market competitiveness. In fact, attention to 3D has gone through ups and downs due to various defects and shortcomings, but with the continuous upgrading of technology in each part of the chain, it has surged roughly every ten to fifteen years.

The last explosion point was around 2010 with the release of the movie Avatar, when the popularization of 3D digital cinemas drove high-quality 3D experiences. At the same time, the launch of 3D TVs triggered a sales boom and the emergence of 3D TV channels, bringing about explosive growth in 3D. However, due to the complexity and high cost of content production, poor terminal experiences, the inconvenience of wearing 3D glasses, the lack of clarity and poor effect of naked-eye terminals, and other issues, 3D terminals were not competitive enough in

the market compared with the rapidly popularized 4K UHD terminals, and around 2016 3D entered a low period.

At the Google I/O conference held in May 2021, Google announced the holographic video chat technology Project Starline (a 3D video chat room), triggering a huge response in the industry. It is a groundbreaking light field display system built on computer vision, machine learning, spatial audio and data compression technology, designed to replace one-on-one 2D video conference calls with an interactive communication experience in which users feel as if they are sitting in front of a real

person.

In the past two years, a new generation of 3D display terminals has been maturing, including the naked-eye 3D pads released by ZTE. GOOVIS and other manufacturers have launched head-mounted home cinemas, portable head-mounted displays that connect directly to 5G mobile phones, closely binding high-quality 3D content with 5G mobile terminals. In addition, the VR head-mounted display terminal is itself one of the media for 3D content. The emergence of a new generation of portable terminals thus provides a good opportunity for the revival of 3D.

Application innovations such as 3D cloud video, naked-eye 3D calls, and the XR metaverse, and especially UHD technology, which empowers 3D content production and realism by upgrading high-definition 3D to 4K 3D and greatly improving the sense of reality and immersion, place higher requirements on the existing network and require faster transmission from cloud to terminals. 6G technology will allow consumers not only to watch naked-eye 3D videos at home, but also to enjoy 3D video experiences and other diverse application scenarios on mobile phones and dedicated devices at any time in places such as outdoor areas, subways, and buses.

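The stereoscopic terminals discussed above all consume left/right eye pairs packed into a single frame; one common layout, half side-by-side (which also appears later in this paper's production table), can be sketched in a few lines. This is an illustrative toy with naive column dropping, not a production resampler:

```python
# Pack left-eye and right-eye frames into one "half side-by-side" frame:
# each eye is squeezed to half width, then the halves sit side by side,
# so the packed frame keeps the original frame size. Toy 8x4 frames.

def half_side_by_side(left, right):
    """left/right: 2D lists (rows of pixels), same size, even width."""
    packed = []
    for row_l, row_r in zip(left, right):
        half_l = row_l[::2]  # naive 2:1 horizontal downsample
        half_r = row_r[::2]
        packed.append(half_l + half_r)
    return packed

left = [["L"] * 8 for _ in range(4)]
right = [["R"] * 8 for _ in range(4)]
frame = half_side_by_side(left, right)
print(frame[0])  # ['L', 'L', 'L', 'L', 'R', 'R', 'R', 'R']
```

A 3D-capable display reverses the packing: it stretches each half back to full width and routes one half to each eye.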
The development of the naked-eye 3D industry is inseparable from the support of excellent video content, especially live broadcast content. The high network bandwidth provided by 6G and the rapidly developing AIGC (AI-generated content) will greatly improve the efficiency and quality of content creation, and thus greatly promote the rapid development of industrial applications such as naked-eye 3D video.

1.2 Development Trends of VR/XR Technology and Application

In the 5G era, 5G + VR has become one of the standardized application scenarios of 5G, and several virtual reality (VR) applications, such as VR panoramic video and VR cloud gaming, have appeared on major video and gaming platforms.

8K technology has greatly improved the presentation of VR video content. In particular, the Apple Vision Pro, with monocular 4K×4K LED resolution, achieves a high resolution of 35-37 PPD at an immersive viewing angle of nearly 100°, raising the restoration ability of VR head-mounted display terminals to an unprecedented level. It is fair to say that with the fully immersive 360° viewing angle of VR, 8K and higher definitions will become a real necessity. As high-quality VR live broadcast content with a resolution of 8K or above keeps emerging, and large-scale immersive content can be efficiently mass-produced, the large-bandwidth, low-latency advantages of 6G will provide viewers with the high-bit-rate real-time data transmission and low-latency interactive responses that 8K and higher resolutions require, and high-definition three-dimensional spatial videos on a new generation of VR head-mounted display terminals may be highly sought after by users.

At the same time, extended reality (XR), as an important technical support for the Metaverse, has attracted more and more attention from the industry. With the emergence of lightweight head-mounted displays, there are more and more augmented reality (AR) and mixed reality (MR) applications. At present, these head-mounted displays are mainly connected to servers via wires or Wi-Fi, which limits the area where

users can use the service. As 6G network coverage gradually increases, the ultra-large bandwidth, ultra-low latency, and ultra-high-speed connections of 6G will meet the needs of a wider range of XR services.

As more new video application scenarios and terminals continue to emerge in the 6G era, we can predict that in the near future viewers, no matter where they are, will be able to watch program content through different experience methods (large UHD screens, vertical screens of mobile phones, 3D TV, VR and AR, outdoor on-board devices, Internet access devices, experience devices with tactile perception, etc.) and enjoy a variety of viewing and interactive experiences based on their own needs and preferences.

2. Characteristics and Technical Requirements of Different Terminal Application Scenarios of Future TV

2.1 The Concept and Development Trends of Future TV

Future TV is a broad audiovisual concept that represents the future development trends of the audiovisual industry. It is more than television; it is a new conceptual model and a systematic, revolutionary iterative upgrade. Its main characteristics include diversified presentation methods, immersive audiovisual experiences, application in all scenarios, intelligent service forms, and collaborative service supply.

Specifically, diversified presentation methods mean that the presentation media are not limited to televisions, but include various display media seen everywhere in life, such as handheld terminals, wearable devices, and outdoor large screens.

Immersive audiovisual experiences include UHD, immersive video, 3D sound, VR/AR, MR, XR, interactive video, free view, stereoscopic 3D, holographic imaging and other experience methods. Viewers can choose audiovisual services freely.

Application in all scenarios means that future TV is expected to be applied in all scenarios of life requiring audiovisual services, and to be fully integrated into the digital life of the masses.

Intelligent service forms will enable a close integration of the real and virtual worlds, free selection of program content, real-time interaction, on-demand customization, intelligent distribution, and human-centered services, offering abundant interactive experiences to meet the multi-level needs of consumers.

Collaborative service supply means that future TV will drive collaboration and integration across the industry, both internally and externally, to form a new production and broadcasting system, service system and management system featuring network interconnection, service interworking and data sharing.

Figure 2.1.1 With diverse and immersive audiovisual experiences, various terminals and media provide more abundant audiovisual choices for viewers.

2.2 Diversified Presentation Methods and Characteristics of Production of Immersive Audiovisual Experience

On the one hand, providing more targeted, customized content experiences for different media terminals has become a necessity. Viewers, no matter where they are, can watch the same program content using different

experience methods (large UHD screens, 3D TV, VR and AR, outdoor on-board devices, Internet access devices, experience devices with tactile perception, etc.), enjoying diverse viewing experiences.

Figure 2.2.1 There will be diverse terminals in the "future TV" scenarios.

Figure 2.2.2 The composite program production method in Meta Studio provides various forms of content for future TV.

Figure 2.2.3 Different cameras, based on standard 4K/8K production and broadcasting systems and formats, form a composite production system.

Based on this demand and application scenario, to meet the requirements of large-scale program production, we can consider using different cameras, such as 4K/8K live-production broadcast cameras, 8K VR cameras, and 4K/3D cameras, all following the same radio and television production specifications, and connecting them to the 4K/8K broadcast-level live-production system to form a high-quality program production process. Ultimately, the encoder provides live streams of the corresponding specifications according to the requirements of different terminals and platforms, and then transmits them via the 5G network or other channels.

User Terminal | Resolution (Maximum Supported) | Aspect Ratio | Frame Rate | Dynamic Range | Reference Bit Rate (to Platforms) | Remarks
4K large screen | 3840×2160 | 16:9 | 50/60P | HDR | 30 Mbps |
8K large screen | 7680×4320 | 16:9 | 50/60P | HDR | 80 Mbps |
Mobile phone (vertical screen) | 1080×1920 | 9:16 | 50/60P | HDR | 10 Mbps |
VR head-mounted display (supports 8K) | 7680×3840 | 2:1 | 50/60P | SDR | 80 Mbps | 7680×4320 (16:9) can be adopted in the production process.
Naked-eye 3D pad | 3840×2160 | 16:9 (4K, half side by side) | 50/60P | SDR | 30 Mbps | 3840×1080 (HD, full side by side) can also be used.
AR glasses | 3840×2160 | 16:9 (4K, half side by side) | 50/60P | SDR | 30 Mbps | 1920×1080 (HD, half side by side) can also be used.

Table 2.2.1 Reference Values for Production Specifications for Different Terminals

On the other hand, providing immersive audiovisual experiences is the ultimate goal of all video content creators and the driving force behind the development of audiovisual technology. We hope to create more realistic and clearer visual experiences that make viewers feel as if they were on the scene. The technological evolution

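The per-terminal reference values in Table 2.2.1 can be captured as a small lookup that an encoder-configuration step might consult; a minimal sketch in which the dictionary layout and function name are our own, not from the paper:

```python
# Reference production specs per terminal, per Table 2.2.1 of this paper.
TERMINAL_SPECS = {
    "4k_large_screen":  {"resolution": (3840, 2160), "dynamic_range": "HDR", "bitrate_mbps": 30},
    "8k_large_screen":  {"resolution": (7680, 4320), "dynamic_range": "HDR", "bitrate_mbps": 80},
    "mobile_vertical":  {"resolution": (1080, 1920), "dynamic_range": "HDR", "bitrate_mbps": 10},
    "vr_hmd_8k":        {"resolution": (7680, 3840), "dynamic_range": "SDR", "bitrate_mbps": 80},
    "naked_eye_3d_pad": {"resolution": (3840, 2160), "dynamic_range": "SDR", "bitrate_mbps": 30},
    "ar_glasses":       {"resolution": (3840, 2160), "dynamic_range": "SDR", "bitrate_mbps": 30},
}

def encoder_settings(terminal, frame_rate=50):
    """Return the live-stream parameters for one terminal (all run 50/60P)."""
    spec = TERMINAL_SPECS[terminal]
    w, h = spec["resolution"]
    return {"size": f"{w}x{h}", "fps": frame_rate,
            "range": spec["dynamic_range"], "bitrate_mbps": spec["bitrate_mbps"]}

print(encoder_settings("vr_hmd_8k"))
# {'size': '7680x3840', 'fps': 50, 'range': 'SDR', 'bitrate_mbps': 80}
```

A real system would also carry the per-terminal remarks (e.g. half side-by-side packing for the 3D pad and AR glasses) and negotiate 50 vs 60P with the platform.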
and product design of terminals also aim to provide viewers with better immersive experiences.

Figure 2.2.4 Using a giant screen to offer a larger viewing angle is a common method of providing immersive experiences.

To achieve immersive audiovisual experiences, we first work to create the largest possible viewing angle for viewers, fully cover their effective field of view, and provide as much effective information as possible (meeting viewers' visual and auditory requirements). A 60° horizontal viewing angle is the common standard, and the best seats in cinemas and theaters are usually defined according to it. 4K TV also uses this standard to design a reasonable resolution: based on the pixels per degree of viewing angle (each degree must provide no less than 60 pixels to ensure clarity), the 4K image resolution is defined as 3840×2160 (aspect ratio 16:9). 8K (7680×4320) further expands the viewing angle on this basis with more environmental edge information, increasing the standard viewing angle to 96°, meaning that viewers are equivalent to watching a 100-inch 8K TV from a distance of more than 1 meter.

Figure 2.2.5 Relationship

between 4K and 8K Resolution Design and Horizontal Viewing Angle

Though small in size, VR head-mounted display terminals can provide viewers with a larger viewing angle. Specially produced VR immersive video content can offer a viewing angle of 180° or 360°, creating a truly frameless panoramic experience.

Figure 2.2.6 VR head-mounted display terminals can provide a 360° viewing angle, creating a fully immersive, frameless experience.

Since the field of view (FOV) of the LED screen in VR head-mounted display terminals exceeds 90°, which is close to the standard viewing angle of 8K large screens, the LED screen should in theory support 8K resolution (with pixels per degree (PPD) close to 60); for 360° VR panoramic images, horizontal clarity should then meet the requirements of 30K resolution. However, the production and processing capability of current software, hardware and displays can only encode and decode 8K video. Therefore, current industry standards recommend that panoramic 360° videos reach 8K resolution, which is only approximately HD level within the viewing distance of the LED screen, showing that there is still room for improvement before offering a true UHD experience.

Figure 2.2.6 The viewing angle of VR head-mounted displays is greater than 90°, and 8K image content can only meet the basic requirement of clarity.

With the advent of Apple's head-mounted display terminal Vision Pro, the concept of spatial video has been widely discussed. This, along with the naked-eye 3D holographic video chat technology Project Starline launched by Google a few years earlier, can be considered a further upgrade and expansion of binocular stereoscopic 3D (S3D) presentation technology.

Based on the parallax produced by binocular vision and the principle of line-of-sight convergence, stereoscopic images (mainly binocular stereoscopic S3D herein) can make viewers feel real distance and gain a sense of space. By using a dual-lens camera to shoot synchronously, simulating human binocular vision, a spatial stereoscopic video can be obtained, and then

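The viewing-angle arithmetic above, at least 60 pixels per degree (PPD) for clarity and an 8K-wide image spread across a headset FOV slightly over 90°, can be made concrete in a few lines; treating the FOV as exactly 90° is our simplification of the paper's 30K estimate:

```python
# Horizontal pixels needed for a given horizontal viewing angle at a
# given pixel density (pixels per degree, PPD).
def pixels_for_angle(angle_deg, ppd):
    return angle_deg * ppd

print(pixels_for_angle(60, 60))   # 3600: covered by 4K's 3840 columns
print(pixels_for_angle(96, 60))   # 5760: covered by 8K's 7680 columns

# VR: an 8K-wide image across a ~90 degree FOV gives the in-headset density,
# and extending that density to a full 360 degree panorama explains "30K".
ppd_vr = 7680 / 90
print(round(pixels_for_angle(360, ppd_vr)))  # 30720, i.e. roughly 30K
```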
with the help of a 3D screen or a VR or AR head-mounted display terminal, stereoscopic 3D images can be correctly displayed.

The 3D stereoscopic effect is an important factor in the human sense of space and immersion. In combination with flat imaging technology, VR/AR and other display methods, and according to the FOV and PPD requirements of different terminals, production can output signals such as 4K/HD binocular 3D and 8K 360° or 180° 3D VR to provide different spatial stereoscopic effects.

Figure 2.2.7 8K 3D 180° VR Immersive Content (Side by Side)

Figure 2.2.7 Photographing Methods and Output Effects of

180° 3D VR and Binocular 3D

The biggest difference between 3D videos and 3D VR videos is the field of view. The viewing angle of 3D videos is the same as that of ordinary 2D videos and is mainly determined by the focal length of the lens. To make videos feel more three-dimensional, medium and wide-angle lenses are usually used to cover more space. However, due to the limits of the viewing angle of ordinary lenses and the 16:9 aspect ratio, when the viewer watches ordinary 3D content, the sense of space and immersion is constrained by the image frame. Because of the frame, when the subject in three-dimensional space gradually approaches the viewer, it may not be presented in its entirety, the so-called inability to come out in front of the screen, and the viewer will feel unable to be fully immersed in the digital space built by the creator.

Figure 2.2.8 Limited by the image frame, the subject of an ordinary 3D image cannot fully come out in front of the screen.

3D VR images combine the advantages of panoramic and binocular stereoscopic photography, providing a complete immersion effect in digital images with almost no frame; with the help of the stereoscopic parallax brought by 3D photography, the viewer gains a strong sense of space. Compared with ordinary 3D images, the viewer can completely enter the 3D digital space and get a truly immersive experience.

However, compared with ordinary 3D and large flat-screen

4K and 8K, the effects of 3D VR are different, as are the content presentation forms, creation techniques, and difficulty of production. It is fair to say that they all have their own advantages and disadvantages, and we cannot simply say which is better or worse. Different production and presentation technologies can be reasonably selected according to the content, and can also be mixed in the same scenario to offer different experiences and feelings to viewers. This is the composite production mode for different terminals in the future TV scenario, allowing viewers to freely choose their desired viewing method according to their own preferences.

Figure 2.2.9 3D 180° VR shooting can get viewers truly immersed in the virtual space and make them feel they can interact within their reach.

The high dynamic range (HDR), wide color gamut (WCG), high frame rate (HFR) and immersive

spatial audio involved in UHD audio and video technologies can all empower the above-mentioned display terminals and production systems, providing the best technical adaptation for different terminals and bringing the best immersive experience to viewers.

Figure 2.2.10 Evolution Direction of Digital Audio and Video Technology

In order to maintain unified adaptation standards across production, broadcasting platforms and player terminals, for the above application scenarios in the 4K/8K field, it is recommended to follow the current UHD image production and transmission standards and specifications in China,

and to refer to the Technical Requirements for the Production and Broadcasting of 8K UHD TV Programs (Interim) officially released by China Media Group in 2021, which defines the implementation standards for the basic technical parameters of audio and video for 8K UHD TV programs in China. The 8K TV production and broadcasting specifications include 7680×4320 resolution, 50 frames per second, 10-bit quantization, the HLG 1000-nit HDR standard, and the BT.2020 WCG standard. Terminals such as VR head-mounted displays can make partial adjustments based on actual conditions (e.g. the VR aspect ratio is 2:1, while the actual 8K resolution is 7680×3840).

S/N | Item | Technical Requirements
1 | Aspect ratio | 16:9
2 | Resolution (effective pixels) | 7680×4320
3 | Sampling method | Orthogonal sampling
4 | Pixel aspect ratio | 1:1 (square)
5 | Pixel order | From left to right and top to bottom
6 | Frame rate (Hz) | 50
7 | Scan mode | Progressive (line by line)
8 | Quantization | 10-bit
9 | Color gamut | BT.2020
10 | High dynamic range | HLG standard/1000 nit (GY/T 315-2018)
11 | Sampling | 4:2:2
12 | Audio encoding format | PCM 24-bit
13 | Audio sampling frequency | 48 kHz
14 | Channels | Supports 16-channel PCM audio

Figure 2.2.11 The Technical Requirements for the Production and Broadcasting of 8K UHD TV Programs (Interim), officially released by China Media Group in January 2021, defines the basic technical parameters of audio and video for 8K UHD TV programs.

China Media Group has also stipulated the encoding standards and bit rates for 8K UHD TV broadcast signals: for 8K UHD TV broadcast signal encoding, the AVS3 standard Information Technology Intelligent Media Coding Part 2: Video (T/AI 109.2-2020) should be adopted; for 8K UHD signals (7680×4320/50P/HDR), the benchmark 10-bit profile and 10.0.60 level should be adopted, with a video encoding bit rate of no less than 120 Mbps. The audio encoding standard and bit rate are 5.1 surround sound

encoding, with an encoding bit rate of 448 Kbps.

For 8K UHD interactive on-demand TV, China Media Group has stipulated the file format parameters shown in the table below, which state that video encoding should adopt the AVS3/H.266/H.265 standards and the total bit rate must be higher than 80 Mbps.

It can be seen that as 8K technology standards improve, the new application scenarios place higher requirements on high bandwidth and low latency, in both the production domain and the transmission domain (especially transmission bandwidth, network transmission and processing delay, and the performance of encoding and decoding software and hardware), posing new challenges to the current hardware foundation and network conditions. On the current 5G network, limited by the transmission rate and the data processing capability of terminals, high-compression encoding (such as H.265) must be used to process video content such as 4K and 8K. On the one hand, compression and encoding/decoding affect how faithfully the signal is restored; on the other hand, the delay caused by encoding and decoding, together with the channel delay, widens the delay gap between wireless and wired links. In the 5.5G and 6G stages, it is recommended to gradually consider adopting low-compression video encoding such as JPEG XS, using lower compression ratios (for example, a 10:1 compression ratio for 8K signals, restricting the single-channel data volume to 1-10 Gbps) and encoding/decoding processes with relatively low computational complexity, to obtain higher-quality signals while reducing the overall delay on the encoding and decoding side.

Item | Parameter Value | Description
Total bit rate | More than 80 Mbps |
Encoding method | AVS3/H.266/H.265, Main 10 |
Aspect ratio | 16:9 |
Resolution | 7680×4320 |
Pixel aspect ratio | 1:1 |
Frame rate (Hz) | 50 |
Quantization | 10-bit |
Color gamut | BT.2020 |
High dynamic range | HLG standard |
Sampling | 4:2:0 |
Bit rate control | CBR: bit rate control and TS-layer empty packet filling are fixed to the constant bit rate mode with empty packets. VBR: the maximum bit rate should not exceed 1.8 times the average bit rate. | In VBR mode, a high peak-to-average bit rate ratio reduces transmission quality, so exceeding 1.8 is not recommended.
Audio format | Surround sound |
Audio sampling rate | 48 kHz |
Packaging format | MPEG-TS |

Figure 2.2.12 The Technical Requirements for the Production and Broadcasting of 8K UHD TV Programs (Interim), officially released by China Media Group in January 2021, defines the file format parameters of 8K UHD interactive on-demand TV.

2.3 Development of 6G+UHD-Related Application Scenarios

In the early development stage of 5G, 5G+4K was used as a typical application scenario. Based on the large bandwidth of the 5G radio access network, it can meet the requirements of UHD content transmission; with slicing technology, it can offer reliable and stable transmission links, providing a more cost-effective transmission solution to replace wired dedicated networks, optical fiber, or satellite transmission.

On this basis, the network's advantages can be further leveraged to expand application scenarios, providing viewers with new interactive functions and experiences such as free switching among multiple views, and enabling parallel, synchronous transmission of multiple HD or 4K signals. Based on edge computing deployment, it enables high-quality synchronous switching and processing of signals with ultra-low latency, so that viewers experience smooth content switching without any sense of delay, and better enjoy the high image quality brought by large bandwidth and the strong interactivity brought by low latency.

Figure 2.3.1 The combination of 5G + UHD + multi-view brings a better interactive user experience.

Figure 2.3.2 Block Diagram of the 5G+4K+Multi-view Production System

The production system for vertical-screen videos for mobile terminals is essentially no different from that for horizontal-screen videos; the main changes are in framing and composition. In terms of production methods, we can directly use a vertically positioned camera to shoot 9:16 images, or shoot 4K or 8K horizontal-screen videos and crop them into vertical-screen images before output, or combine the two methods. We can also, based on users' viewing habits, the director's design, the characteristics of vertical-screen viewing and users' interactive needs, introduce new presentation forms, for example a dual-screen vertical display bringing multiple views and multiple scenarios, to better differentiate from the experience and content of horizontal-screen viewing.

Figure 2.3.3 The vertical-screen live broadcast of the 2024 China Media Group Spring Festival Gala used a variety of lens stitching to create a remarkable vertical-screen viewing experience different from horizontal-screen viewing.

The application of 5G+VR was also one of the first 5G implementation scenarios. However, as 8K technology gradually matures, VR head-mounted display terminals need upgrading to support real-time decoding of 8K signals, since the clarity of 4K VR panoramic video is still insufficient. The high resolution of 8K offers a clear and realistic VR experience, so 5G+8K+VR has become a necessity.

PICO, the VR platform of ByteDance, created the first 8K 3D VR real-time interactive concert at Wang Xi's concert in April 2022. Later, Zheng Jun and Wang Feng also adopted 8K 3D 180° VR with real-time interaction at their concerts. With great breakthroughs in key viewing factors such as definition, viewing angle setting, and scenario interaction, these concerts greatly improved the user experience of VR live streaming.

Figure 2.2.4 Integrated Virtual and Real Experience Effect of PICO 8K 3D VR Live Streaming at Zheng Jun's Concert

In 2023, 4K Garden cooperated with China Mobile's Migu to provide customized 8K VR live streaming of performance content, including LIVEHOUSE and CGT48 girls group live performances, for the Migu APP on mobile devices and the Migu Cloud VR APP on HMD devices. This brought constantly updated, high-quality live VR programs to VR platforms. Different from the previous production mode, in which the VR camera position always had to compromise with traditional broadcast cameras, this program prioritizes the immersion and presence of VR users, and integrates the virtual and the real to provide a face-to-face immersive and interactive experience.

Figure 2.2.5 8K VR Live Plan Jointly Launched by 4K Garden and China Mobile's Migu, Providing VR Viewers with a Face-to-Face Immersive and Interactive Experience

In addition to online users' VR head-mounted display terminals, 8K VR live streaming can design more immersive scenarios and experience methods for offline users. For example, the panoramic video images are displayed on a large-sized

he early development stage of 5G, 5G+4K was adopted as a typical application scenario. Based on the large bandwidth of the 5G radio access network, its main feature, it can meet the requirements of UHD content transmission; combined with network slicing, it can offer reliable and stable transmission links, thus providing a more cost-effective transmission solution to replace wired dedicated networks, optical fiber, or satellite transmission.

On this basis, the advantages of the network can be further leveraged to expand application scenarios: viewers gain new interactive functions and experiences such as free switching among multiple views, and multiple HD or 4K signals can be transmitted in parallel and in synchronization. Based on edge computing deployment, signals can be switched and processed synchronously, at high quality and with ultra-low latency, so that viewers experience smooth content switching without any perceptible delay and better enjoy both the high image quality brought by large bandwidth and the strong interactivity brought by low latency.

Figure 2.3.1 Combination of 5G+UHD+Multi-view Brings a Better Interactive User Experience

Figure 2.3.2 Block Diagram of the 5G+4K+Multi-view Production System

The production system for vertical-screen videos on mobile terminals is essentially no different from that for horizontal-screen videos; the main changes lie in framing and composition. In terms of production methods, we can position the camera vertically to photograph 9:16 images directly, photograph 4K or 8K horizontal-screen videos and crop them into vertical-screen images before output, or combine the two methods. We can also introduce new presentation forms based on users' viewing habits, the director's design, the characteristics of vertical-screen viewing, and users' interactive needs. For example, a dual-screen vertical display can bring multiple views and multiple scenarios, better differentiating the experience and content from horizontal-screen viewing.

Figure 2.3.3 The vertical-screen live broadcast of the 2024 China Media Group Spring Festival Gala uses a variety of lens stitching to create a remarkable vertical-screen viewing experience different from horizontal-screen viewing.
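The crop-based production method above can be sketched in a few lines (our own illustrative example; the function name and its defaults are hypothetical and not from any broadcast system):

```python
def vertical_crop_rect(src_w, src_h, center_x=None):
    """Compute a 9:16 crop rectangle from a horizontal frame.

    center_x lets the operator keep the subject centered; it defaults to
    the middle of the frame. Returns (x, y, w, h) in source pixels.
    """
    crop_h = src_h                    # keep the full height
    crop_w = crop_h * 9 // 16         # 9:16 aspect ratio
    if center_x is None:
        center_x = src_w // 2
    # clamp so the crop window stays inside the source frame
    x = min(max(center_x - crop_w // 2, 0), src_w - crop_w)
    return x, 0, crop_w, crop_h

# 4K UHD source (3840x2160) -> a 1215x2160 vertical cut around the center
print(vertical_crop_rect(3840, 2160))
```

In practice the crop center would follow the subject from shot to shot, which is why shooting in 4K or 8K leaves enough resolution headroom for the vertical cut.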

The application of 5G+VR was also one of the first 5G implementation scenarios. However, the clarity of 4K VR panoramic video is still insufficient, and as 8K technology gradually matures, VR head-mounted display terminals need upgrading to support real-time decoding of 8K signals. The high resolution of 8K offers a clear and realistic VR experience; therefore, 5G+8K+VR has become a necessity.

PICO, the VR platform of ByteDance, created the first 8K 3D VR real-time interactive concert at Wang Xi's concert in April 2022. Later, Zheng Jun and Wang Feng also adopted 8K 3D 180° VR with real-time interaction at their concerts. With great breakthroughs in key viewing factors such as definition, viewing angle setting, and scenario interaction,

these concerts greatly improved the user experience of VR live streaming.

Figure 2.3.4 Integrated Virtual and Real Experience Effect of PICO 8K 3D VR Live Streaming at Zheng Jun's Concert

In 2023, 4K Garden cooperated with China Mobile's Migu to provide customized 8K VR live streaming of performance content, including LIVEHOUSE shows and CGT48 girl-group live performances, for the Migu APP on mobile devices and the Migu Cloud VR APP on HMD devices, incorporating constantly updated, high-quality HD live VR programs into VR platforms. Different from the previous production mode, in which the VR camera positions always had to compromise with the traditional broadcast cameras, this program prioritizes the immersion and presence of VR users and integrates the virtual and the real to provide a face-to-face immersive and interactive experience.

Figure 2.3.5 8K VR Live Plan Jointly Launched by 4K Garden and China Mobile's Migu, Providing VR Viewers with a Face-to-Face Immersive and Interactive Experience

In addition to serving online users on VR head-mounted display terminals, 8K VR live streaming can offer more immersive scenarios and experience methods for offline users. For example, panoramic video images can be displayed on a large hemispherical LED screen using XR-like technology, so that viewers enjoy a sense of presence without wearing VR glasses.

Figure 2.3.6 8K VR Live Streaming on a Panoramic LED Screen, Providing Offline Viewers with an Immersive Experience

4K Garden cooperated with Communication University of China to deploy a multi-camera 8K VR live streaming system and a VR multicast presence solution at the university's 2023 opening ceremony, providing an immersive live viewing experience for offline users. This scenario can also be extended to wider cross-industry applications, such as cultural tourism and commercial activities, live medical teaching, on-site industrial testing, education, and immersive sports and fitness.

Figure 2.3.7 Multi-camera VR Live Streaming and VR Viewing for Offline Users Launched by 4K Garden and Communication University of China at the Opening Ceremony

Figure 2.3.8 System Block Diagram for 6G+8K VR Multi-camera Live Streaming and Online and Offline Viewing Experience

Figure 2.3.9 Multi-industry and Cross-Field Application Methods of 6G+8K VR Live Streaming Scenarios

Starting from 2023, a batch of new 3D display terminals has gradually emerged, especially mobile phones, pads, and portable AR headsets. With 5G and 5.5G technologies, they can achieve a better mobile stereoscopic viewing experience. Both 3D spatial video experiences and application development based on commercial 3D models are widely applied to entertainment, education, industry, medical, and other scenarios. Compared with VR headsets, these 3D display terminals are more lightweight, less complex to produce for, and easier to popularize.

Figure 2.3.10 New 3D Display Terminals, Including a Large 3D LED Screen, a Naked-Eye 3D Pad, and a 3D HMD

Common mobile terminals on the market, such as AR headsets and naked-eye 3D pads, usually have a 3D display resolution of 1080P and also support 4K 3D decoding. Therefore, the production end can output 4K or HD binocular 3D signals as required. The live streaming system for these terminals is designed in a way similar to the 3D VR multi-camera live streaming system. The main difference is the lens

(VR uses fisheye lenses, while 3D generally uses medium- and wide-angle lenses). The production workflow needs to support functions such as real-time stereo convergence adjustment (to control whether objects appear behind the screen, in front of the screen, or at the screen plane in the 3D effect), 3D image error correction (such as the vertical and rotation errors caused by 3D camera misalignment), and high-precision binocular image synchronization to avoid uncomfortable 3D viewing.

Figure 2.3.11 Miniaturized, Multi-camera, 5G+4K/HD 3D Live Streaming System

3. Technical Features of Immersive 8K VR Live Streaming Process

This chapter analyzes and describes the technical features of the immersive 8K VR live streaming process, a complex and typical application scenario among 8K live streaming applications in the future TV scenario.

3.1 Overall Framework of 8K VR Live Streaming System

Figure 3.1.1 Overall Framework of 8K VR Live Streaming System

As shown in Figure 3.1.1, a conventional 8K VR live streaming system includes 8K VR content collection, real-time 8K stitching and encoding, streaming media processing, distribution and transmission through the content delivery network (CDN), and terminal streaming media transmission and rendering. The main workflow is as follows: the live streaming team collects multiple video signals at the venue using 8K VR cameras and transmits them back to the 8K stitching and encoding server; the server stitches the panoramic video in real time, encodes it into 8K video streams, and pushes them to the streaming media server using transmission protocols such as SRT; the streaming media server processes and encodes the 8K panoramic live signals in real time to generate live streams in an encoding format suitable for transmission and terminal playback; the cloud CDN distributes the encoded live streams to the edge server closest to the user; and the terminal downloads, decodes, renders, and presents the VR panoramic live streams. The following sections describe the key service processes.

3.2 VR Content Collection and Stitching

VR live streaming requires real-time content production. The content is collected by 360° or 180° 8K cameras. Generally, panoramic cameras have multiple lenses to cover all viewing angles in a scenario; to generate video at a higher resolution, the cameras need higher per-lens resolution and more lenses. During live streaming, multiple cameras work synchronously to capture images at different angles, which are then stitched by a built-in module or an external server.

Mainstream panoramic cameras basically have a built-in stitching module to support real-time panoramic video stitching

at a low resolution. Because of the computing complexity, 8K video images need to be stitched by an external server. Common video stitching algorithms include the transformation-based algorithm and the stitching line-based algorithm. The transformation-based algorithm mainly adjusts the homography matrix and minimizes the gaps in overlapping areas through grid-like distortion; it applies to small viewing-angle transformations. The stitching line-based algorithm mainly re-adjusts the stitching lines between images to ensure natural stitching; it applies to large viewing-angle transformations.

To ensure an all-round and immersive rendering effect, panoramic videos are presented in a spherical form. However, panoramic videos must be transmitted in compliance with existing video encoding standards; that is, the videos shot by multiple cameras at different viewing angles need to be projected into rectangular frames after stitching. Mainstream projection methods include equirectangular projection (ERP) and cube map projection (CMP). ERP is similar to the way a world map is generated. CMP first divides the spherical surface into six areas and projects them onto the six faces of a cube; the bottom, back, and top faces are then arranged into a rectangular frame with the other three faces through specific rotation operations. To improve encoding efficiency, the three faces are rotated during arrangement to keep the media content cohesive at face junctions.
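As an illustrative aside (a minimal sketch of our own, not part of the white paper's pipeline; the function and variable names are hypothetical), the ERP mapping from a viewing direction to a pixel position works like a world-map projection:

```python
import math

def erp_pixel_from_direction(lon, lat, width, height):
    """Map a viewing direction (longitude in [-pi, pi], latitude in
    [-pi/2, pi/2]) to pixel coordinates in an equirectangular frame."""
    u = (lon + math.pi) / (2 * math.pi)   # longitude -> [0, 1], left to right
    v = (math.pi / 2 - lat) / math.pi     # latitude -> [0, 1], north pole on top
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return x, y

# An 8K 360-degree ERP frame is 7680x3840 (2:1); the forward direction
# (lon = 0, lat = 0) lands exactly at the frame center.
print(erp_pixel_from_direction(0.0, 0.0, 7680, 3840))
```

Because every row of the frame spans the full 360° of longitude, rows near the poles are heavily oversampled relative to the sphere, which is the distortion that the encoding techniques below try to compensate for.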

3.3 VR Video Encoding

The VR live streaming system includes two encoding links. First, the stitched and projected videos need to be encoded into 8K live streaming signals to facilitate transmission to the cloud streaming media server. Second, the streaming media server needs to encode the

signals into streams that have different target bitrates and are suitable for transmission and terminal playback.

Similar to 2D video encoding, panoramic video encoding uses a hybrid encoding framework to compress the sequence. However, panoramic video encoding is more difficult than 2D video encoding because of the higher resolution and greater transformation involved. If traditional encoding solutions are used for panoramic videos, the following problems may occur: 1) uneven sampling, which introduces geometric distortion; 2) artificial boundaries and discontinuities caused by the sphere-to-plane projection, which impair spatial prediction efficiency and result in a high bitrate for the encoded video. Both problems reduce video encoding efficiency. To overcome these difficulties, researchers have developed various encoding solutions dedicated to panoramic videos to achieve a better trade-off between encoding efficiency and visual quality. The solutions include:

Region adaptive smoothing: In the ERP plane, the areas near the two poles are much larger than the corresponding areas on the sphere, and human eyes are insensitive to them, so the transformation of these areas becomes less important. The specific solution is to smooth the top and bottom areas of the ERP plane before encoding. This saves many bits, because the smoothed areas only need smaller transformation coefficients, without degrading perceived quality. However, this solution is difficult to extend to other projection methods.

Rate-distortion optimization (RDO): Based on the distortion changes on the spherical surface and their impact on RDO, the algorithm modifies the encoding optimization targets to achieve RDO on the sphere. Because the optimization targets are modified according to the quality evaluation indicators of panoramic videos, the method can be applied to various projection methods. However, it is difficult to establish a rate-distortion model, and the algorithm still uses evaluation indicators based on pixel changes, which cannot properly represent true subjective feelings; the optimization effect therefore still needs improvement.

After receiving the 8K live video streams, the streaming media server performs operations such as transcoding and packaging to form streams in formats such as DASH and HLS that are suitable for Internet transmission, and distributes them through the CDN. Common transmission methods for VR video streams include full-frame transmission and FoV-based tile transmission. Full-frame transmission delivers the complete 360° panoramic images to the terminal; when the user turns his or her head, the view can be switched without downloading any additional video tiles, so this method responds well to head movement. However, it requires very high transmission bandwidth, which is difficult to achieve under existing network conditions.

As shown in Figure 3.3.1, FoV-based tile transmission divides a video into multiple segments in time and divides each segment into multiple tiles in space. Tiles are encoded at different bitrate levels: the server transcodes the source video and stores video files of different bitrate levels per tile, and the client requests tiles at specified bitrate levels as required. The basic idea of bitrate selection is to request a high bitrate for tiles within the viewport, and to request tiles outside the viewport only at a low bitrate or not at all.

Figure 3.3.1 Tile-based VR Video Encoding
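As a toy illustration of this selection rule (our own sketch; the grid size, bitrates, and names are invented for the example):

```python
def select_tile_bitrates(grid_cols, grid_rows, viewport_tiles,
                         high_kbps, low_kbps, fetch_outside=True):
    """Assign a bitrate to every tile of one segment.

    viewport_tiles: set of (col, row) indices predicted to be visible.
    Returns {(col, row): bitrate_kbps}; 0 means the tile is not requested.
    """
    plan = {}
    for row in range(grid_rows):
        for col in range(grid_cols):
            if (col, row) in viewport_tiles:
                plan[(col, row)] = high_kbps    # full quality inside the view
            elif fetch_outside:
                plan[(col, row)] = low_kbps     # cheap fallback outside the view
            else:
                plan[(col, row)] = 0            # skip entirely
    return plan

# 6x4 tile grid with a 2x2 predicted viewport in the middle
plan = select_tile_bitrates(6, 4, {(2, 1), (3, 1), (2, 2), (3, 2)}, 8000, 800)
```

Requesting the surrounding tiles at a low bitrate rather than skipping them trades a little bandwidth for protection against prediction errors, which is exactly the tension discussed next.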

Although the FoV-based tile transmission solution can maximize bandwidth utilization, it depends heavily on viewport prediction. When the prediction window is long, the FoV prediction algorithm has limited accuracy, and the deviation between the predicted viewport and the actual viewport is large; the receiving buffer on the terminal side therefore should not be too large. However, when the network bandwidth fluctuates violently, the receiving end needs a large buffer to cope with jitter. The contradiction between the two easily causes black edges and video lagging.

Figure 3.3.2 Layered VR Video Encoding

To solve this problem, a layered transmission method is widely used. It encodes each segment of a video into a low-resolution (such as 2K), full-frame segment of basic quality plus multiple high-resolution tiles of high quality. When such an encoded VR video is watched on a terminal, the terminal first downloads the full-frame, low-resolution segment of basic quality to obtain a basic viewing experience; it can then selectively download high-quality tiles within the FoV based on actual network conditions. If the predicted viewport is correct and the enhanced-quality tiles are delivered on time, the quality of the video in the viewport is enhanced. If the predicted viewport is incorrect, or the enhanced-quality tiles are not retrieved from the remote end within the playback duration, the user watches the lower-quality video rendered from the segment of basic quality.
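The rendering-side fallback just described can be sketched as follows (a minimal illustration of our own; the names are hypothetical):

```python
def choose_render_source(tile, enhanced_arrival, deadline):
    """Render the enhanced-quality tile only if it arrived before its
    playback deadline; otherwise fall back to the base-quality layer,
    which is always available from the full-frame segment."""
    arrived_at = enhanced_arrival.get(tile)
    if arrived_at is not None and arrived_at <= deadline:
        return "enhanced"
    return "base"

# arrival times (seconds) of enhanced tiles for one segment
arrivals = {(2, 1): 1.8, (3, 1): 2.6}
print(choose_render_source((2, 1), arrivals, deadline=2.0))  # enhanced
print(choose_render_source((3, 1), arrivals, deadline=2.0))  # base (arrived late)
print(choose_render_source((4, 1), arrivals, deadline=2.0))  # base (never requested)
```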

Therefore, this layered panoramic video transmission method is highly robust against dynamic networks and viewport prediction errors.

3.4 VR Content Delivery

The encoded multi-bitrate panoramic video is distributed through the CDN in a way similar to traditional live streaming. With a hierarchical network structure, the CDN usually includes central, regional, and edge nodes. The video content of 8K VR live streaming can be quickly pushed through the CDN to edge nodes closer to users, which greatly reduces the number of concurrent live data streams on the backbone link, improves user experience, and effectively reduces network traffic.

With the rapid development of 5G/6G and multi-access edge computing (MEC) technologies, CDN edge nodes can be deployed even closer to users, such as at hot spots and in 5G access networks, meeting the low-latency and high-bandwidth requirements of 8K VR live streaming. In addition, with virtualization and AI technologies, CDN edge nodes can manage computing, storage, network, and other resources more efficiently, and provide intelligent content caching, super-resolution video enhancement, cloud rendering, and other capabilities.

3.5 VR Streaming Media Transmission

During video streaming

media transmission of 8K VR live streaming, adaptive bitrate selection and download scheduling are key technologies. Adaptive bitrate selection determines how a VR client selects a bitrate for downloading a high-resolution tile under dynamic network link conditions, because different tiles have different encoding methods. Download scheduling determines when to download the segment of basic quality and when to download the tiles of enhanced quality.

The adaptive bitrate selection and download scheduling algorithms aim to provide users with a better experience under limited bandwidth by considering factors such as video lagging, definition, and the response speed to the user's head movement. They depend on the FoV prediction result and select appropriate bitrates for downloading the different tiles within the FoV. The chosen bitrates affect the factors that determine the user's final QoE, such as video quality, lagging, and video quality fluctuation, so this is a complex dynamic planning problem. Solutions include rule-based heuristics and learning-based methods. Traditional rule-based heuristics have low computing complexity but unsatisfactory overall performance, because they only consider short-term optimization. In recent years, researchers have turned to solutions based on deep reinforcement learning; these achieve better strategies in specific scenarios but suffer from poor robustness and high computing complexity. It is therefore crucial to design adaptive VR bitrate and download scheduling algorithms with low complexity and high robustness to improve the user experience of 8K VR live streaming.
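A toy version of such a rule-based heuristic (our own sketch; the bitrate ladder, safety margin, and thresholds are invented for illustration):

```python
def pick_bitrate(ladder_kbps, est_throughput_kbps, buffer_s,
                 safety=0.8, low_buffer_s=1.0):
    """Rule-based heuristic: stay below a safety fraction of the estimated
    throughput, and drop to the lowest rung when the buffer is nearly empty."""
    if buffer_s < low_buffer_s:
        return min(ladder_kbps)               # survival mode: avoid stalling
    budget = est_throughput_kbps * safety
    candidates = [r for r in sorted(ladder_kbps) if r <= budget]
    return candidates[-1] if candidates else min(ladder_kbps)

ladder = [800, 2500, 8000, 16000]
print(pick_bitrate(ladder, 12000, buffer_s=3.0))   # healthy buffer
print(pick_bitrate(ladder, 12000, buffer_s=0.5))   # buffer almost drained
```

As the text notes, such short-horizon rules are cheap but myopic, which is what motivates the learning-based alternatives.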

FoV prediction is another key technology for the video streaming media transmission of 8K VR live streaming; its accuracy directly affects user experience and transmission efficiency. Currently, there are two main ways of predicting the future FoV: one is based on the history of the user's head movement, and the other is based on the visual saliency features of the content itself. The prediction method based on the head movement trajectory is only suitable for short-term prediction, that is, FoV prediction in the next 1 to 2 seconds, as its long-term predictions are not accurate enough. In contrast, the visual saliency-based method can be used for long-term prediction: it can reflect the areas that concentrate users' attention, but it can hardly reflect personalized user behavior, and its computation is complex. The recent development trend of FoV prediction is therefore to combine the two methods for joint prediction, including FoV prediction with edge-terminal collaboration. Nevertheless, FoV prediction methods featuring both low complexity and high accuracy remain an important technical challenge.
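A minimal head-trajectory predictor of the first kind (our own constant-velocity sketch; real systems use richer motion models):

```python
def predict_yaw(samples, horizon_s):
    """Constant-velocity extrapolation of head yaw from the two most
    recent (time_s, yaw_deg) samples, wrapped into [-180, 180)."""
    (t0, y0), (t1, y1) = samples[-2], samples[-1]
    velocity = (y1 - y0) / (t1 - t0)          # degrees per second
    yaw = y1 + velocity * horizon_s
    return (yaw + 180.0) % 360.0 - 180.0

# head turning right at 20 deg/s; predict 1 s ahead
print(predict_yaw([(0.0, 0.0), (0.5, 10.0)], horizon_s=1.0))
```

The longer the horizon, the more such extrapolation drifts from real behavior, which is why it is paired with content-saliency cues for long-term prediction.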

3.6 Cloud VR

As 8K VR usually requires decoding and real-time rendering on the terminal side, its requirements on terminal performance are demanding. To reduce the cost of VR terminals, cloud VR solutions have attracted widespread attention. In cloud VR, interactive content undergoes cloud computing, real-time rendering, and encoding and compression, and is then streamed to the terminal in the form of video streams. Relying on cloud rendering technology, cloud VR deploys the VR content and computing capability in the cloud, and uses high-speed networks to transmit the rendered images and sounds to user terminals, enabling an immersive 8K VR experience without high-performance local equipment. Cloud VR can significantly reduce hardware costs and lower the entry threshold for users, since all a user needs is a lightweight head-mounted display or a smartphone. Through the powerful GPU rendering capability of the cloud, users can enjoy an HD, high-frame-rate VR experience with low latency, contributing to the popularity and accessibility of VR.

The key technologies covered by cloud VR include:

Distributed computing and rendering. Cloud VR offloads computing and graphics rendering tasks from terminal devices to cloud servers and uses the powerful cloud computing resources for efficient processing, reducing the burden on terminal devices. In 8K VR live streaming, even a single user has high requirements for computing resources and network bandwidth, so with many concurrent users, distributed computing and rendering becomes a key technology.

Low-latency transmission. High-bandwidth, low-latency 5G/6G networks are used to transmit the cloud-rendered video streams to terminal devices quickly and stably, ensuring a real-time, smooth user experience. In addition, the CDN and other computing resources are moved to the network edge; by performing data processing at edge nodes, transmission delays are reduced and the perceived response speed is improved, especially for real-time interaction and rendering tasks.

User interaction. Real-time interaction ensures that applications executed in the cloud can respond to user input in real time, including gestures and head movements, so that users have a natural and smooth interaction experience in the virtual environment. Gesture recognition and tracking enables high-precision recognition and tracking of user gestures and head movements, providing a more realistic interaction experience

to users and enhancing the sense of VR immersion.

3.7 Technical Features of 8K VR Live Streaming System

We discussed the cloud-processing-based workflow of 8K VR live streaming above. The live streaming discussed in this section focuses on the localized service of the 8K VR live streaming system for B-end users, whose system and production requirements are mainly based on the specifications for professional content creation.

Depending on the needs of different scenarios, the 8K VR live streaming here can be a single-camera live stream or a multi-camera live stream with a large broadcast system. Considering the actual needs of program content creation, the shooting solutions introduced here are mainly 180° 2D or 3D; 360° is more suitable for panoramic images without a subject (such as landscape and humanities VR documentaries) or panoramic camera

positions for showing the environment at the broadcast site.

A professional 8K VR single-camera live streaming system can consist of the following parts:

Figure 3.7.1 Schematic Block Diagram of a Single-Camera 180° 8K VR Live Streaming System

The system, supporting single-lens 2D (4K×4K 50P) or dual-lens 3D (4K×4K×2 50P), can realize full parameter control and image processing. For 8K 3D VR camera signals, it can complete real-time VR ERP correction (stretching the 180° fisheye image into an equirectangular projection image), 3D binocular image error correction (such as digital correction of optical-axis vertical error), camera tone control (image quality optimization), and remote adjustment of camera parameters (key parameters such as aperture, white balance, ISO, and audio/video delay).

In addition, according to the needs of VR live production, the system can also connect directly to a VR head-mounted display and convert the 8K VR signal for high-quality real-time monitoring on the headset, helping the director confirm the VR shooting effect and achieve "what you shoot is what you see".

The single-camera production method is more suitable for small and medium-sized stage performances with a fixed performance direction, or for activities in similar scenarios (such as commercial e-commerce live streaming, and education and training in the medical industry). In such cases, a viewer at a single viewpoint can basically both get the overall picture of the scene and

pay attention to the details.

Figure 3.7.2 Single-Camera 8K VR Live Streaming Camera Design and Shooting Effect (Migu "Let's Sing and Dance" 8K VR Live)

Based on the single-camera system, it can be expanded into a large-scale multi-camera 8K 2D/3D VR live broadcast system, generally used for large sports events and large performance activities.

Figure 3.7.3 A Small System for 3-Camera 8K 2D 360° VR Live Streaming of Small Programs (Front 180° 3-Camera Switching + Rear 180° Virtual Packaging Stitched into a 360° Panorama)

Figure 3.7.4 Design of 3-Camera VR Broadcast Positions in Basketball Games

Figure 3.7.5 Large-Scale Multi-camera Immersive Live Broadcast System Supporting 8K 3D VR 50P Production

Figure 3.7.6 Camera Design of 6-Camera 8K 3D VR Live Streaming in the M-Zone Street Dance Competition

As long as multiple sets of 8K 3D 180° VR camera live streaming equipment are connected to a broadcast

8K switcher, professional 8K VR live streaming production can be carried out. As the 8K 3D VR signal standard is 7680×3840 (aspect ratio 2:1), the 8K VR camera control processing unit needs to apply letterboxing (black bars at the top and bottom) during signal output to convert the output format into the broadcast 8K resolution standard of 7680×4320 (16:9); only after this conversion can signal transmission and production be carried out.
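The padding involved is simple to compute (an illustrative sketch of our own; the function name is hypothetical):

```python
def letterbox_bars(src_w, src_h, dst_w, dst_h):
    """Sizes of the top and bottom black bars when padding a frame into a
    taller raster, e.g. the 7680x3840 VR signal into the 7680x4320
    broadcast standard."""
    assert src_w == dst_w and dst_h >= src_h, "width must match; only pad height"
    pad_total = dst_h - src_h
    top = pad_total // 2
    return top, pad_total - top

print(letterbox_bars(7680, 3840, 7680, 4320))  # 240-pixel bars top and bottom
```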

However, at the end of the front-end live streaming system, it is recommended to adjust the final output VR signal format back to 7680×3840 (aspect ratio 2:1) on the encoder side for adaptation to VR head-mounted display terminals, to avoid deformation of the displayed image caused by an improper output format. Generally, according to the requirements of live program presentation and terminal output effects, 2D camera images (such as the PGM or close-up shots of a 2D live stream) and information such as event data templates or program introductions can be converted into VR-adapted spherical effects through a VR processor and then embedded into the VR live streaming picture, or superimposed on the rear 180° virtual packaging picture generated by the VR signal processing unit and then stitched with the front 180° real shots to form a 360° VR panoramic video image.

Figure 3.7.7 Link Diagram of 8K 3D VR Format Conversion

The produced 8K VR signal is connected to two 8K encoders on the main and backup channels to launch the 8K streaming signal. The current 8K VR live streaming scenario, operated in cooperation between 4K Garden and Migu, is mainly based on Secure Reliable Transport (SRT) for encoding and transmission. Under the 8K 50P VR video specification, the transmission rate is 80 Mbps to ensure sufficient 8K picture quality.

Figure 3.7.8 Handling Process for 8K 3D VR Live Streaming Signals

The Real-Time Messaging Protocol (RTMP), a streaming media transmission protocol, was the standard widely used on the public network in the HD and 4K stages, with broad platform support. It is usually used in a relatively stable network environment, and its requirements on network quality are high. For 8K live streaming, the high resolution and high bit rate may require higher bandwidth and more powerful server support; moreover, RTMP may have problems under unstable network conditions. Although RTMP does not set

a fixed upper limit on video bit rates, it may be limited when faced with very high bit rates, especially under unstable network conditions. In addition, as RTMP may introduce a large delay during transmission, it may not be suitable for application scenarios that require low latency.

SRT is designed to provide reliable streaming media transmission under unstable network conditions, incorporating technologies such as error correction, dynamic bandwidth adjustment, and a retransmission mechanism. These features make SRT more stable and robust when carrying high-bit-rate video. Compared with RTMP, SRT may be more suitable for transmitting high-bit-rate 8K video, because it can maintain high transmission quality even under poor network conditions and can adapt by flexibly adjusting bandwidth. Even on unstable networks, SRT has the potential to better maintain the transmission stability and quality of high-bit-rate video.

4. Future TV's Demand for 6G Mobile Networks

Compared with 5G networks, the most significant advantage of 6G mobile networks is the jump of the user's actual experienced rate to the Gbps level, together with a lower latency that 5G networks cannot provide:

User experience rate ≥ 1 Gbps; latency ≤ 2.5 ms

In the future TV application scenario, the level of real-time data processing and transmission required by the front-end production domain is far greater than what users need for reception and viewing. As broadcast audio and video signal production has extremely high requirements on signal quality and transmission processing delay, past live broadcast systems basically adopted uncompressed video signal specifications, with the signal data bit rates of HD 50i, HD 50P, 4K 50P, and 8K 50P under the 10-bit 4:2:2 standard reaching 1.5 Gbps, 3 Gbps, 12 Gbps, and 48 Gbps, respectively. Because this amount of data could not be transmitted wirelessly over 5G or microwave before the 6G era, a high-compression encoding and decoding process had to be used to reduce the data volume for transmission. The costs of this process, however, are degraded image quality and the additional encoding and decoding delay introduced by complex high-compression algorithms. As a result, the actual total delay of transmission over the 5G radio access network (RAN) may

far exceed the theoretical value or the ideal expectation.

The actual user experience rate brought by the 6G mobile network is greater than 1 Gbps, which means there is no need to use complex high-compression encoding, especially for data-heavy 4K or even 8K videos. A pre-compression ratio of about 10:1 can be used to reduce the actual data volume to 1.2 Gbps and 4.8 Gbps, respectively.
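These figures are straightforward arithmetic on the uncompressed rates quoted above; as a sanity check:

```python
# Uncompressed 10-bit 4:2:2 rates from the text, divided by the ~10:1
# pre-compression ratio (values in Gbps)
uncompressed_gbps = {"4K 50P": 12.0, "8K 50P": 48.0}
ratio = 10
compressed_gbps = {fmt: rate / ratio for fmt, rate in uncompressed_gbps.items()}
print(compressed_gbps)  # {'4K 50P': 1.2, '8K 50P': 4.8}
```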

For example, the JPEG-XS pre-compression encoding method can ensure both high-quality images and high encoding efficiency, while also achieving lower encoding delays.

JPEG-XS, a newer international standard developed by the Joint Photographic Experts Group (JPEG), achieves visually lossless compression at a low compression ratio. It is an intra-frame, shallow-compression coding algorithm based on the wavelet transform, carried using the ST 2110-22 standard, and can support up to 16-bit precision, 120 fps, and 8K resolution. JPEG-XS features low encoding complexity and strong hardware platform affinity, and is efficiently supported by all current hardware platforms. Meanwhile, because JPEG-XS encodes within each frame without prediction in the time dimension, it greatly reduces latency, in spite of the compression

ratio that is not large.

These advantages can improve editing efficiency during production and reduce problems caused by the large amount of 8K file data during transmission (such as the 12G-SDI×4 interface or 100G IP stream required for uncompressed 8K processing). They also make it possible to transmit 8K video over 10G bandwidth while ensuring high performance, high stability, and multi-layer real-time editing capabilities.

Figure 4.1 Transmission Link of 6G+4K/8K JPEG-XS Wireless Camera Shooting

We can say that, provided the transmission is stable and reliable, the combination of 6G+4K/8K JPEG-XS represents a qualitative change for high-end professional live production requirements compared with previous microwave and other radio transmission solutions. It can narrow the image quality gap between wired and wireless cameras to the point where the difference is negligible, and it provides more creative possibilities for lightweight production and broadcast system design and for special wireless shooting scenarios.

In the 6G scenarios defined by the ITU, future TV belongs to the immersive communication application scenario, a derivative of 5G eMBB. In the 6G era, applications including 8K live streaming are expected to serve future TV better as network speed increases and transmission delay is optimized.

Figure 4.2 Six Scenarios and Four Principles of IMT-2030

For the back-end user application scenarios of future TV, the significant improvement in 6G RAN bandwidth capacity can also further reduce the decoding and processing burden of high-bit-rate data streams such as 8K, 3D, VR, and XR on the terminal side. For 8K-resolution UHD video, 8K 2D and 3D VR videos, 4K 3D videos,

and even future higher-resolution 2D/3D/VR and other video content, the data volume is several times that of conventional 4K 50P video. If high-compression H.265 encoding is used, for example compressing an 8K 50P stream to roughly 100 Mbps, the bit rate can be kept within the 5G user-experienced data-rate range, but the demands on terminal-side hardware decoding capability remain high. Even though the data volume itself does not look large, the encoding and decoding load drives up hardware performance requirements and cost, which hinders actual content distribution and popularization.

Similarly, interactive experiences in 3D and VR scenarios place strict demands on low latency. The lower latency and higher bit rates brought by 6G will help reduce the complexity of data processing and better support the design and development of more complex interactive scenarios.
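The trade-off described above, a low bit rate with a heavy codec load versus light compression with high bandwidth, can be quantified with a quick sketch. The raw-rate assumption (8K 50p, 4:2:2, 10-bit, active picture only) is ours, not the white paper's.

```python
# Compare two operating points for the same 8K 50p source.
# Raw-rate assumption (ours): 4:2:2, 10 bits per sample, active picture only.

raw_mbps = 7680 * 4320 * 20 * 50 / 1e6   # ~33,177.6 Mbit/s uncompressed

h265_mbps = 100        # heavy H.265 distribution encode cited in the text
jpegxs_mbps = 10_000   # lightly compressed feed filling a 10G link

print(f"H.265 compression ratio:   {raw_mbps / h265_mbps:.0f}:1")    # 332:1
print(f"JPEG-XS compression ratio: {raw_mbps / jpegxs_mbps:.1f}:1")  # 3.3:1
```

The two ends of this range illustrate why 6G matters on the terminal side: the roughly 332:1 encode minimizes data volume but concentrates complexity in the decoder, while the roughly 3:1 point keeps decoding trivial and latency low at the cost of bandwidth. A higher-capacity 6G link lets system designers pick intermediate points that lighten the terminal without an unmanageable bit rate.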

In short, 6G mobile communications can support Internet video encoding and decoding on the user side and re-select a new balance point among compression ratio, processing power and efficiency, delay, data volume, and final effect, so that fewer compromises are made on the final result and a more ideal audio-visual experience is created for the viewer.

5. Conclusion

If 5G wireless communication drove the rapid popularization and technological upgrading of UHD video, VR, and related applications, then on this basis 6G will go further and expand them, bringing audiences a more immersive audiovisual experience on new terminals (a new generation of HD VR head-mounted display terminals and naked-eye 3D display terminals).

Future TV, built on different terminal media, adopts a variety of creation methods and production techniques to meet different types of user experience needs and to give viewers greater freedom of choice. The sense of immersion and presence is the ultimate goal of the audiovisual experience. The introduction and support of 6G technology and its immersive communication scenario, much as HDR and WCG empowered UHD technology, will lend wings to this upgrade of the audiovisual experience and clear the bottlenecks of the existing chain, opening up unlimited possibilities.

We also look forward to the refinement, implementation, and popularization of 6G wireless communication technology in the near future, while the video technologies and application scenarios of future TV continue to be polished and upgraded. Building on the existing 5G+UHD, 5G+VR, and other scenarios, we expect more disruptive breakthroughs that truly serve the times and meet users' demand for a high-quality experience.

References:
1. Overview of the 8K JPEG-XS Production and Broadcasting Full Process Test, China Media Group, Yan Li and Bin Liu, 2022
2. Technical Requirements for the Production and Broadcasting of 8K UHD TV Programs (Interim), China Media Group, January 2021
3. Future Vision 2030-2040, NHK Science & Technology Research Laboratories
4. ITU-R WP5D Completed the Recommendation Framework for IMT-2030 (Global 6G Vision), Huawei, Yan Chen, Peiying Zhu, Wen Tong

Participants:
4K Garden - Lu Yu
Academy of Broadcasting Science (ABS) - Yu Zhang
Key Laboratory of Media Convergence and Communication, Communication University of China (CUC) - Tao Lin
Qualcomm - Yiqing Cao

