Dear Professor,
Thank you very much for your presentation.
I am working on interference management and would like to learn more about how interference in multi-cell networks is managed in industry and academia today. Could you please share some links on this topic? I also follow your videos a lot; could you please make a session on interference management in multi-cell networks? Thank you.
Thanks, just a short question about a slide at the beginning (symbol duration). Do we know how long such an interval is with respect to the period? In the picture it is always exactly one period. Regardless of the bandwidth, it is always exactly one period, yes?
Thank you.
This is a good question! The illustration is a simplification. The precise statement is that for a bandwidth of B Hz, we can transmit B digital symbols per second. Each digital symbol is mapped to a waveform (like the four illustrated in the video), and those waveforms will in general partially overlap in time, but they are designed and time-shifted so that at the exact time instants where the receiver samples the received signal, only one symbol is observed at a time.
@@WirelessFuture Many thanks! So can we let this period be just half as long and sample more often (here: doubling the sample rate)? Could this double the data rate at the same bandwidth? I assume not, otherwise it would have been done already. (Perhaps sampling such a short period is technically infeasible due to symbol overlapping, or such approaches automatically lead to a doubling of the bandwidth.) Thanks.
@@knutlohmann8205 As you say, doubling the sample rate will automatically double the bandwidth. This is an instance of the sampling theorem.
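A minimal numerical sketch of the point made above (my own illustration, not from the video; the sinc pulse shape and parameter values are chosen just for demonstration): overlapping sinc waveforms of symbol period T satisfy the Nyquist zero-ISI criterion, so sampling at t = nT sees exactly one symbol at a time.

```python
import numpy as np

T = 1.0                                    # symbol period (illustrative)
symbols = np.array([1.0, -1.0, 1.0, 1.0])  # example binary symbols

def transmit(t):
    """Superposition of time-shifted sinc waveforms, one per symbol.

    The pulses overlap in time, but np.sinc(n - k) is 1 for n == k and
    (numerically) 0 otherwise, so there is no inter-symbol interference
    at the sampling instants.
    """
    return sum(a * np.sinc(t / T - k) for k, a in enumerate(symbols))

# Sample exactly at the symbol instants t = 0, T, 2T, 3T
samples = np.array([transmit(n * T) for n in range(len(symbols))])
print(np.allclose(samples, symbols))  # True: each sample shows one symbol
```

Between the sampling instants the waveforms do overlap, which is why sampling faster (without widening the bandwidth) would not reveal extra symbols, consistent with the reply above.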
Prof, you may know about compression codecs. As you know, more and more people use mobile devices with small screens (in other words, all you need on a 6-inch screen is FHD), and when VVC becomes an industry standard you will only need 5-10 Mbps for FHD streaming. Do you think we still need Gbps cellular connectivity?
I agree with you! Data traffic continues to grow by 30-40% per year, but this isn't because we need higher and higher speeds per device for video streaming and other data-intensive applications; it is because we use such applications more frequently on more devices. Hence, future networks need to support higher sum capacity and more uniform speeds, while higher peak rates are of little importance. This is something that I've been talking about in previous videos as well:
ua-cam.com/video/XTFu4usUb4I/v-deo.html (1 minute)
ua-cam.com/video/zTL7bntcosc/v-deo.html (44 minutes)
That said, some people believe that AR/VR applications will change the game, and require 1+ Gbps as a minimum.
Hi Professor, I want to work on my thesis on resource optimization in downlink cell-free massive MIMO systems. Could you please point me in a direction?
The last chapter of our textbook arxiv.org/abs/2108.02541 outlines different resource optimization problems, such as power allocation, pilot assignment, and AP clustering. Since then, some people have also looked into how to take constraints from the C-RAN architecture into account: arxiv.org/pdf/2202.09254
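As a starting point, here is a hedged sketch of one simple downlink power-allocation baseline from the cell-free literature (fractional power allocation; all parameter values and the variable names are my own illustrative assumptions, not taken from the textbook): each AP splits its power budget among the users in proportion to a power of the large-scale fading gain.

```python
import numpy as np

rng = np.random.default_rng(2)
L, K = 16, 8      # number of APs and users (illustrative)
P_max = 1.0       # per-AP downlink power budget (illustrative)
nu = -0.5         # fractional exponent; nu = 0 gives equal power,
                  # nu < 0 shifts power toward users with weak channels

# Random large-scale fading gains between each AP and user (illustrative)
beta = 10 ** (rng.uniform(-8, -5, size=(L, K)))

# Fractional power allocation: AP l gives user k a share of P_max
# proportional to beta[l, k] ** nu
weights = beta ** nu
rho = P_max * weights / weights.sum(axis=1, keepdims=True)

print(np.allclose(rho.sum(axis=1), P_max))  # True: budgets are respected
```

Simple heuristics like this are useful baselines before attempting the optimization-based schemes (max-min fairness, sum-SE maximization) discussed in the textbook chapter.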
@@WirelessFuture Thank you, Professor!
Dear Emil, a long-time follower of your channel here. I would like to know your opinion on something, if possible: I've long had a craving for a more fundamental understanding of wireless communication, as I rushed into the telecom industry (I work for a network vendor) after getting my degree, and now I feel more like a technician than a solid expert. My question is whether it is feasible or realistic to self-learn this material in the little spare time left after job and family obligations, or whether it would be more reasonable to pursue something more exciting but less ambitious within the job, even if less research-oriented/theoretical. One of my main concerns is the significant math, signal processing, communications, etc. background I would have to refresh, since I am very rusty on those fundamentals, so it might take quite some years before I am able to, for instance, understand a book on MIMO well. Any suggestions or tips from your experience working with and teaching this material would be highly appreciated.
I think it should be possible to get an intuitive understanding by studying these topics slowly. When it comes to MIMO, there is a Massive MIMO handbook from Ericsson that you can start by reading: www.ericsson.com/en/ran/massive-mimo
Next year, I will release an introductory book on the topic with mathematical details and recap of the fundamental theory. It might be interesting for you.
There is also the website Brilliant.org, where one can learn basic theory in STEM using interactive examples. It is a paid subscription service. I'm not associated with it and haven't tried it, but it looks promising.
Hello Professor.
Is there really a need for fully distributed signal processing algorithms in the cell-free MIMO framework (i.e., signal processing done completely at the AP side)?
I think the difference between the cell-free MIMO framework and other frameworks is that it has a CPU as the core of the signal processing.
I've been thinking about this question: if we emphasize fully distributed signal processing algorithms in cell-free MIMO research, does that make cell-free lose its character?
The first papers on cell-free Massive MIMO focused on the fully distributed case, to make use of local processing power at the APs and avoid passing around CSI. However, I agree with you: the performance losses in the distributed case are too large compared to the centralized case. This is why this presentation focused on the centralized case, and on how to implement the centralized processing sequentially to reduce the fronthaul signaling.
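To illustrate the sequential idea in the simplest possible way (my own sketch under i.i.d. Rayleigh fading assumptions, not the exact algorithm from the presentation; all parameter values are illustrative): instead of forwarding raw antenna signals, each AP along the fronthaul chain adds its local contribution to two fixed-size statistics, and the CPU obtains exactly the centralized MMSE estimate.

```python
import numpy as np

rng = np.random.default_rng(1)
L, N, K, snr = 4, 2, 3, 10.0   # APs, antennas per AP, users, SNR (linear)

# Local channels and received signals at each AP: y_l = sqrt(snr) H_l x + n_l
H = [(rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))) / np.sqrt(2)
     for _ in range(L)]
x = rng.standard_normal(K) + 1j * rng.standard_normal(K)
y = [np.sqrt(snr) * H[l] @ x +
     (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
     for l in range(L)]

# Sequential accumulation along the fronthaul: each AP forwards only a
# K x K Gramian and a K x 1 matched-filter output, not its antenna signals
G = np.zeros((K, K), dtype=complex)   # running sum of H_l^H H_l
z = np.zeros(K, dtype=complex)        # running sum of H_l^H y_l
for l in range(L):
    G += H[l].conj().T @ H[l]
    z += H[l].conj().T @ y[l]
x_hat = np.sqrt(snr) * np.linalg.solve(snr * G + np.eye(K), z)

# Fully centralized baseline: stack all antennas at the CPU
H_full, y_full = np.vstack(H), np.concatenate(y)
x_hat_central = np.sqrt(snr) * np.linalg.solve(
    snr * H_full.conj().T @ H_full + np.eye(K), H_full.conj().T @ y_full)
print(np.allclose(x_hat, x_hat_central))  # True: same estimate, less fronthaul
```

The fronthaul load per hop stays fixed at K^2 + K complex numbers regardless of how many antennas each AP has, which is the essence of why sequential implementations reduce the signaling.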
@@WirelessFuture
Thank you very much for your presentation and answers.
Moreover, the method you provided in your paper "Learning-Based Downlink Power Allocation in Cell-Free Massive MIMO Systems", which uses neural networks combined with domain knowledge to approximate centralized power allocation in a distributed manner, inspired me a lot.
In my study of deep-learning-based signal processing for cell-free MIMO, such as uplink data combining, the nonlinear, black-box characteristics of deep learning mean that only link-level simulation can be performed, and it is difficult to obtain a closed-form expression for the SE. Therefore, it isn't easy to compare the SE performance with linear methods.
Is there any good way to solve this problem?
@@Wireless-AI Closed-form expressions can seldom be obtained, and when they exist, it is only for simple methods such as MR that aren't of practical interest. In that paper, we run Monte Carlo simulations to compute most of the performance expressions, except those related to MR. The important thing is to find a way to express the performance using expectations; then one can Monte Carlo simulate all the expectations that cannot be computed analytically.
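A minimal sketch of this Monte Carlo approach (my own illustration with assumed parameter values and i.i.d. Rayleigh fading; the `combiner` function here is MMSE, but a learned combiner could be plugged in at the same place): estimate the SE by averaging log2(1 + SINR) over random channel realizations.

```python
import numpy as np

rng = np.random.default_rng(0)
M, K, snr = 8, 4, 10.0   # antennas, users, transmit SNR (linear); illustrative
n_trials = 2000

def combiner(H):
    """MMSE combining vector for user 0 (swap in any method, e.g., a NN)."""
    A = snr * H @ H.conj().T + np.eye(M)
    return np.linalg.solve(A, H[:, 0])

se_samples = []
for _ in range(n_trials):
    # i.i.d. Rayleigh fading channel realization
    H = (rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K))) / np.sqrt(2)
    v = combiner(H)
    signal = snr * np.abs(v.conj() @ H[:, 0]) ** 2
    interference = snr * sum(np.abs(v.conj() @ H[:, k]) ** 2 for k in range(1, K))
    noise = np.linalg.norm(v) ** 2
    se_samples.append(np.log2(1 + signal / (interference + noise)))

print(f"Estimated SE of user 0: {np.mean(se_samples):.2f} bit/s/Hz")
```

Since the same realization loop works for any combiner, linear or learned, this is how a deep-learning method and a linear baseline can be compared on equal terms even without closed-form SE expressions.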
@@WirelessFuture Thank you again for your reply.
The idea behind the DECT system has been around since about the year 2000, in the Ericsson MD110 PBX and products from other vendors, and it is quite similar to the cell-free design (distributed APs with one CPU), but I guess they didn't use the right algorithmic send/receive techniques, and that performance didn't inspire cellular telephone designers.
I believe what you refer to is a switching device for managing many phone calls, while the APs are not doing joint transmission at the physical layer.